<p><b>Canada's New "Online Harms" bill: an overview and a few critiques</b> (Canadian Privacy Law Blog, March 4, 2024)</p>
<p><iframe width="720" height="480" src="https://www.youtube.com/embed/K2D4WivpPMY" title="Canada's New &quot;Online Harms&quot; bill - an overview and a few critiques" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe></p>
<p> <span style="font-family: Arial, sans-serif; font-size: 11pt; white-space-collapse: preserve;">It is finally here: the long-anticipated <a href="https://www.parl.ca/LegisInfo/en/bill/44-1/C-63" target="_blank">Online Harms bill</a>. It was tabled in Parliament on February 26, 2024 as Bill C-63. It is not as bad as I expected, but it has some serious issues that need to be addressed if it is going to be Charter-compliant. It also has some room for serious improvement and it represents a real missed opportunity in how it handles “deepfakes”, synthetic explicit images and videos.</span></p><span id="docs-internal-guid-47d19501-7fff-c713-f343-5c3cf692979a"><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The bill is 104 pages long and it was just released, so this will be a high-level and perhaps incomplete overview. But I will also focus on some issues that leapt out at me on my first few readings of it.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">In a nutshell, it does a better job than the <a href="https://www.canada.ca/en/canadian-heritage/campaigns/harmful-online-content/technical-paper.html" target="_blank">discussion paper first floated years ago</a> by not lumping all kinds of “online harms” into one bucket and treating them all the same. This bill more acutely addresses child abuse materials and non-consensual distribution of intimate images. 
I think the thresholds for some of this are too low, resulting in removal by default. The new Digital Safety Commission has stunning and likely unconstitutional powers. As is often the case, there’s too much left to the regulations. But let’s get into the substance.</span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Who does it apply to?</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">So what does it do and who does it apply to? It applies to social media companies that meet a particular threshold that’s set in regulation. 
Social media companies are defined as:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-style: italic; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space-collapse: preserve;">social media service</span><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"> means a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content. (service de média social)</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">It also specifically includes: (a) an adult content service, namely a social media service that is focused on enabling its users to access and share pornographic content; and (b) a live streaming service, namely a social media service that is focused on enabling its users to access and share content by live stream.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">This seems 
intended to capture sites like PornHub and OnlyFans, but I think there are arguments that could be made to say that they would not fit within that definition. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">It specifically excludes services that do not permit a user to communicate to the public (s. 5(1)) and carves out private messaging features. So instead of going after a very long list of service providers, it is much more focused, but this can be tailored by the minister by regulation. </span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">New bureaucracy</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The Online Harms Act creates a whole new regulatory bureaucracy, which includes the Digital Safety Commission, the Digital Safety Ombudsperson and the Digital Safety Office. The Digital Safety Commission is essentially the regulator under this legislation and I'll talk a little bit later about what its role is. The Ombudsperson is more of an advocate for members of the public and the Digital Safety Office is the bureaucracy that supports them both. 
As an aside, why call the bill the “Online Harms Act” but call the Commission the “Digital Safety Commission”? We have a Privacy Act and a Privacy Commissioner. We have a Competition Act and a Competition Commissioner. We have a Human Rights Act and a Human Rights Commissioner. In this bill, it’s just inelegant. </span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Duty to act responsibly</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The legislation will impose a duty to act responsibly with respect to harmful content by implementing processes and mitigation measures that have to be approved by the Digital Safety Commissioner of Canada. This is extremely open-ended and there is no guarantee or assurance that this will be compatible with the digital safety schemes that these companies would be setting up in order to comply with the laws of other jurisdictions. We need to be very careful that “made-in-Canada solutions” don't result in requirements that are disproportionately burdensome in light of our market size. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The large social media companies that immediately come to mind already have very robust digital safety policies and practices, so whatever is dictated by the Digital Safety Commissioner should be based on existing best practices and not trying to reinvent the wheel.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">If you are a very large social media company, you likely are looking to comply with the laws of every jurisdiction where you are active. Canada is but a drop in the internet bucket and work done by organizations to comply with European requirements should be good enough for Canada. If the cost of compliance is too onerous, service providers will look to avoid Canada, or will adopt policies of removing everything that everyone objects to. 
And the social media companies will be required to pay for the new digital bureaucracy, so that adds significantly to their cost of doing business in Canada.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">In addition to requiring government-approved policies, the Bill does include some mandatory elements like the ability of users to block other users and flag harmful content. They also have to make a “resource person” available to users to hear concerns, direct them to resources and provide guidance on the use of those resources. </span></p><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Age appropriate design code</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">One thing that I was blown away by is largely hidden in section 65. 
It reads …</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"><b>Design features</b></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">65 An operator must integrate into a regulated service that it operates any design features respecting the protection of children, such as age appropriate design, that are provided for by regulations.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">I was blown away by this for two reasons. The first is that it gives the government the power to dictate potentially huge changes or mandatory elements of an online service. And they can do this by simple regulation. Protecting children is an ostensible motive – but often a pretext – for a huge range of legislative and regulatory actions, many of which overreach. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The second reason why I was blown away by this is that it could amount to an “Age Appropriate Design Code”, via regulation. In the UK, the Information Commissioner’s Office carried out massive amounts of consultation, research and discussion before developing the UK’s age appropriate design code. In this case, the government can do this with a simple publication in the Canada Gazette. </span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Harmful content</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">A lot of this Bill turns on the question “what is harmful content?” It is defined in the legislation as seven different categories of content, each of which has its own specific definition. They are: 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(a) intimate content communicated without consent;</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(b) content that sexually victimizes a child or revictimizes a survivor;</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(c) content that induces a child to harm themselves;</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(d) content used to bully a child;</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(e) 
content that foments hatred;</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(f) content that incites violence; and</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(g) content that incites violent extremism or terrorism. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Importantly, the bill treats the first two types of harmful content as distinct from the rest. This actually makes a lot of sense. Child sexual abuse materials are already illegal in Canada and are generally easy to identify. I am not aware of any social media service that will abide that sort of content for a second. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The category of content called “intimate content communicated without consent” is intended to capture what is already illegal in the Criminal Code related to the non-consensual distribution of intimate images. The definition in the Online Harms bill expands on that to incorporate what are commonly called “deepfakes”. These are images depicting a person in an explicit manner that are either modifications of existing photographs or videos, or are completely synthetic, created from someone's imagination or with the use of artificial intelligence.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">I 100% support including deepfake explicit imagery in this Bill and I would also 100% support including it in the Criminal Code given the significant harm that it can cause to victims, but only if the definition is properly tailored. In the Online Harms bill, the definition is actually problematic and potentially includes any explicit or sexual image. Here is the definition, and note the use of “reasonable to suspect”. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-style: italic; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space-collapse: preserve;">intimate content communicated without consent</span><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"> means</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(a) a visual recording, such as a photographic, film or video recording, in which a person is nude or is exposing their sexual organs or anal region or is engaged in explicit sexual activity, if it is </span><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space-collapse: preserve;">reasonable to suspect</span><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"> that</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 72pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; 
font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(i) the person had a reasonable expectation of privacy at the time of the recording, and</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 72pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(ii) the person does not consent to the recording being communicated; and</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(b) a visual recording, such as a photographic, film or video recording, that falsely presents in a reasonably convincing manner a person as being nude or exposing their sexual organs or anal region or engaged in explicit sexual activity, including a deepfake that presents a person in that manner, if it is </span><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space-collapse: preserve;">reasonable to suspect</span><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"> that the person does not consent to the recording being communicated. 
(contenu intime communiqué de façon non consensuelle)</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">So what is the problem? The problem is that the wording “reasonable to suspect” cannot be found in the Criminal Code definition for this type of content and there is a very good reason for that. Either content is consensual or it is not. In the Criminal Code at section 162.1, it reads:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(2) In this section, "intimate image" means a visual recording of a person made by any means including a photographic, film or video recording,</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(a) in which the person is nude, is exposing his or her genital organs or anal region or her breasts or is engaged in explicit sexual activity;</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; 
vertical-align: baseline; white-space-collapse: preserve;">(b) in respect of which, at the time of the recording, there were circumstances that gave rise to a reasonable expectation of privacy; and</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(c) in respect of which the person depicted retains a reasonable expectation of privacy at the time the offence is committed.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">In the Criminal Code, either there is consent or there is not. In this Bill, the threshold is the dramatically low “reasonable to suspect”. All you need is a reasonable suspicion and it is not just with respect to the circumstances at the time the image was taken or created, assuming we're dealing with an actual person and an actual image. The courts have said </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The words “to suspect” have been defined as meaning to “believe tentatively without clear ground” and “be inclined to think” ... suspicion involves “an expectation that the targeted individual is possibly engaged in some criminal activity. 
A ‘reasonable’ suspicion means something more than a mere suspicion and something less than a belief based upon reasonable and probable grounds”.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">You can be 85% confident that it is consensual, but that remaining 15% results in reasonable suspicion that it is not. When you're dealing with the section related to purported deepfakes, it does not specify that the image has to be of an actual person, whether synthetic or not. It could in fact be a completely fictional person that has been created using Photoshop. Such an image would cause no risk of harm to anyone. Given that the image is artificial and the circumstances of its creation are completely unknown, as is the person supposedly depicted in it, you can't help but have reasonable grounds to suspect that it “might” have been communicated non-consensually. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Deepfakes of actual people created using artificial intelligence are a real thing and a real problem. But artificial intelligence is actually better at creating images and videos of fake people. You should not be surprised that it is being used to create erotic or sexual content of AI-generated people. While it may not be your cup of tea, it is completely lawful. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">And it actually gets even worse, because with respect to deepfakes, the Online Harms Act turns on whether the actual communication itself may have been without consent, not the creation of the image. Setting aside for a moment that a fictional person can neither give nor withhold consent, an example immediately comes to mind drawn directly from Canada's history of bad legislation related to technology and online mischief.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">People may recall that a number of years ago, Nova Scotia passed a law called the Cyber-safety Act which was intended to address online bullying. It was so poorly drafted that it was ultimately found to be unconstitutional and thrown out.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">During the time when that law was actually enforced, we had an incident in Nova Scotia where two young people discovered that their member of the legislature had previously had a career as an actor. 
As part of that career, she appeared in a cable television series that was actually quite popular, and in at least a couple of scenes she appeared without her top on. These foolish young men decided to take a picture from the internet, of which there were hundreds to choose from, and tweet it. What happened next? The politician got very mad and contacted the Nova Scotia cyber cops, who threatened the young men with all sorts of significant consequences.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">That image, which was taken in a Hollywood studio, presumably after the actor had signed the usual releases, would potentially fit into this category of harmful content if it were tweeted after the Online Harms Act comes into effect, because someone reviewing it on behalf of a platform after it had been flagged would have no idea where the image came from. And if anyone says it’s non-consensual, that’s enough to create reasonable suspicion. One relatively explicit scene actually looks like it was taken with a hidden camera. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Surely, it cannot be the intention of the Minister of Justice to regulate that sort of thing. 
In some ways, it doesn't matter, because it would likely be found to be a violation of our freedom of expression right under section 2(b) of the Charter of Rights and Freedoms, one that cannot be justified under section 1 of the Charter.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">But wait, it gets worse. With respect to the two special categories of harmful content, operators of social media services have an obligation to put in place a flagging mechanism so that objectionable content can be flagged by users. If there are reasonable grounds to believe that the content that has been flagged fits into one of those two categories, they must remove it. “Reasonable grounds to believe” is also a very low standard. But when you combine the two, the standard is so low that it is in the basement. Reasonable grounds to believe that there are reasonable grounds to suspect is such a low standard that it is probably unintelligible.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Deepfake images are a very real problem. When a sexually explicit but synthetic image of a real person is created, it has significant impacts on the victim. If the drafters were doing anything other than window dressing, they would have paid very close attention to the critical definitions and how this content is handled. 
But they have created a scheme in which anything that is explicit could be put into this category by anybody, rendering the whole thing liable to be thrown out as a violation of the Charter, thereby further victimizing vulnerable victims. And if they had gotten the definition right, which they clearly did not, this could have been addressed in the Criminal Code, because the harm associated with the dissemination of explicit deepfakes is similar to the harm associated with the already criminalized non-consensual distribution of actual intimate images.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">It actually gets even worse, because the Digital Safety Commissioner can get involved and order the removal of content. That removal is again based simply on reasonable grounds to believe that the material is within that category, which in turn only requires reasonable grounds to suspect a lack of consent. This is a government actor ordering the removal of expressive content, which unquestionably engages the freedom of expression right. 
Where you have a definition that is so broad that it can include content that does not pose any risk of harm to any individual, that definition cannot be upheld as Charter compliant.</span></p><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Flagging process</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">If a user flags content as either sexually victimizing a child or as intimate content communicated without consent, the operator has to review it within 24 hours. The operator can only dismiss the flag if it’s trivial, frivolous, vexatious or made in bad faith; or has already been dealt with. If not dismissed, they MUST block it and make it inaccessible to people in Canada. If they block it – which is clearly the default – they have to give notice to the person who posted it and to the flagger, and give them an opportunity to make representations. The timeline for this will be set in the regulations. Based on those representations, the operator must decide whether there are reasonable grounds to believe the content is that type of harmful content, and if so, they have to make it inaccessible to persons in Canada. Section 68(4) says they’d have to continue to make it inaccessible to all persons in Canada, which suggests to me that they have to have a mechanism to make sure it is not reposted. There is a reconsideration process, which is largely a repeat of the original flag and review process. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">One thing that I find puzzling is that this mechanism is mandatory and does not seem to allow the platform operator to do their usual thing, which is to review material posted on their platform and simply remove it if they are of the view that it violates their platform policies. If it is clearly imagery that depicts child sexual abuse, they should be able to remove it without notice to the original poster or involving them. </span></p><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Information grab</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Each regulated operator has to submit a “digital safety plan” to the Digital Safety Commissioner. The contents of this are enormous. 
It’s a full report on everything the operator does to comply with the Act, and also includes information on all the measures used to protect children and prevent harmful content, statistics about flags and takedowns (broken down by category of content), resources allocated by the operator to comply, and information respecting content, other than “harmful content”, that was moderated by the operator and that the operator had reasonable grounds to believe posed a “risk of significant psychological or physical harm.” But that’s not all … it also includes information about complaints, concerns heard and any research the operator has done related to safety on their platform. And, of course, “any other information provided for by regulations.” And most of this also has to be published on the operator’s platform. </span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Researchers’ information grab </span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The Commission can accredit people (other than individuals) to access electronic data in digital safety plans. These people must be conducting research, education, advocacy, or awareness activities related to the purposes of the Act. The Commission can grant access to these inventories and suspend or revoke accreditation if the person doesn't comply with the conditions. 
Accredited people can also request access to electronic data in digital safety plans from regulated service operators, and the Commission can order that the operator provide the data. However, this access is only allowed for research projects related to the Act's purposes.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">This is another area where the parameters, which are hugely important, will be left to the regulations. There’s no explicit requirement that the accredited researcher have their research approved by a Canadian research ethics board. It’s all left to the regulations. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">We need to remember that “Cambridge Analytica” got their data from a person who purported to be an academic researcher. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">If the operator of a regulated service affected by an order requests it, the Commission may consider changing or canceling the order. 
The Commission may do so if it finds, according to the criteria in the regulations, that the operator can't comply with the order or that doing so would cause the operator undue hardship. An accredited person who requested an order may complain to the Commission if the operator subject to the order fails to comply. The Commission must give the operator a chance to make representations. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Finally, the Commission may publish a list of accredited people and a description of the research projects for which the Commission has made an order.</span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Submissions from the public</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The Act contains a mechanism by which any person in Canada may make a submission to the Commission respecting harmful content that is accessible on a regulated service or the measures taken by the operator of a regulated service to comply with the operator’s duties under the Act. 
The Commission can provide information about the submission to the relevant operator and there are particular provisions to protect the identity of any employees of an operator that make a submission to the Commission. </span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Complaints to the Commission</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The real enforcement powers of the Commission come into play in Part 6 of the Act. Any person in Canada may make a complaint to the Commission that content on a regulated service is content that sexually victimizes a child or revictimizes a survivor or is intimate content communicated without consent. These are the particularly acute categories of deemed “harmful content.”</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The Commission has to conduct an initial assessment of the complaint and dismiss it if the Commission is of the opinion that it is trivial, frivolous, vexatious or made in bad faith; or has otherwise been dealt with. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">If the complaint is not dismissed, the Commission must (not may) give notice of the complaint to the operator and make an order requiring the operator to, without delay, make the content inaccessible to all persons in Canada and to continue to make it inaccessible until the Commission gives notice to the operator of its final decision. This is an immediate takedown order without any substantial consideration of the merits of the complaint. All they need is a non-trivial complaint. I don’t mind an immediate takedown if one reasonably suspects the content is child sexual abuse material, but the categories are broader than that.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The operator must ask the user who posted the content on the service whether they consent to their contact information being provided to the Commission. If the user consents, the operator must provide the contact information to the Commission. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">“Hey, you’re being accused of posting illegal content on the internet, do you want us to give your information to the Canadian government?”</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The Commission must give the complainant and the user who communicated the content on the service an opportunity to make representations as to whether the content is content that fits into those categories of harmful content. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Now here is where the rubber hits the road: The Commission must decide whether there are “reasonable grounds to believe” that the content fits into those categories. In a criminal court, the court would have to consider whether the content fits the definition, beyond a reasonable doubt. In a civil court, the court would have to consider whether the content fits the definition, on a balance of probabilities. 
Here, all the Commission needs to conclude is whether there are “reasonable grounds to believe.” If they do, they issue an order that it be made permanently inaccessible to all persons in Canada.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">That is a dramatically low bar for permanent removal. Again, I’m not concerned about it being used with material that is child abuse imagery or is even reasonably suspected to be. But there is a very strong likelihood that this will capture content that really is not intimate content communicated without consent. Recall the definition, and the use of “reasonable to suspect”:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-style: italic; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space-collapse: preserve;">intimate content communicated without consent</span><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"> means</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(a) a visual 
recording, such as a photographic, film or video recording, in which a person is nude or is exposing their sexual organs or anal region or is engaged in explicit sexual activity, if it is </span><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space-collapse: preserve;">reasonable to suspect</span><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"> that</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 72pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(i) the person had a reasonable expectation of privacy at the time of the recording, and</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 72pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(ii) the person does not consent to the recording being communicated; and</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(b) a visual 
recording, such as a photographic, film or video recording, that falsely presents in a reasonably convincing manner a person as being nude or exposing their sexual organs or anal region or engaged in explicit sexual activity, including a deepfake that presents a person in that manner, if it is </span><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space-collapse: preserve;">reasonable to suspect</span><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"> that the person does not consent to the recording being communicated. (contenu intime communiqué de façon non consensuelle)</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">To order a permanent takedown, the Commission just needs to conclude there are reasonable grounds to believe that it is “reasonable to suspect” a lack of consent. There’s no requirement for the complainant to say “that’s me and I did not consent to that.” Unless you know the full context and background of the image or video, and know positively that there WAS consent, there will almost always be grounds to suspect that there wasn’t. And remember that the deepfake provision doesn’t specifically require that it be an actual living person depicted. It could be a complete figment of a computer’s imagination, which is otherwise entirely lawful under Canadian law. 
But it would still be ordered to be taken down. </span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The Commission’s vast powers</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The Commission has vast, vast powers. They’re breathtaking, actually. These are set out in Part 7 of the Act. Here’s part of these powers:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">86 In ensuring an operator’s compliance with this Act or investigating a complaint made under subsection 81(1), the Commission may, in accordance with any rules made under subsection 20(1),</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(a) summon and enforce the appearance of persons before the Commission and compel them to give oral or written evidence on oath and to produce any documents or other things that the 
Commission considers necessary, in the same manner and to the same extent as a superior court of record;</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(b) administer oaths;</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(c) receive and accept any evidence or other information, whether on oath, by affidavit or otherwise, that the Commission sees fit, whether or not it would be admissible in a court of law; and</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(d) decide any procedural or evidentiary question.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">And check out these “Rules of evidence” (or absence of rules of evidence) for the Commission:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 
0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">87 The Commission is not bound by any legal or technical rules of evidence. It must deal with all matters that come before it as informally and expeditiously as the circumstances and considerations of fairness and natural justice permit.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">If the Commission holds a hearing – and it is entirely within its discretion to determine when a hearing is appropriate – it must be held in public unless it isn’t. There’s a laundry list of reasons why it can decide to close all or part of a hearing to the public. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">I don’t expect we’ll see hearings for many individual complaints.</span></p><br /><h2 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;"><span style="font-family: Arial, sans-serif; font-size: 13pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Inspectors</span></h2><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The next part is staggering. In section 90, the Commission can designate “inspectors” who get a “certificate of designation”. Their powers are set out in section 91. Without a warrant and without notice, an inspector can enter any place in which they have reasonable grounds to believe that there is any document, information or other thing relevant to verifying compliance or preventing non-compliance with the Act. 
Once they’re in the place, they can </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(a) examine any document or information that is found in the place, copy it in whole or in part and take it for examination or copying;</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(b) examine any other thing that is found in the place and take it for examination;</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(c) use or cause to be used any computer system at the place to examine any document or information that is found in the place;</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(d) reproduce any document or information or cause it to be reproduced and take it for examination or copying; and</span></p><br /><p dir="ltr" style="line-height: 1.38; 
margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">(e) use or cause to be used any copying equipment or means of telecommunication at the place to make copies of or transmit any document or information.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">They can force any person in charge of the place to assist them and provide documents, information and any other thing. And they can bring anybody else they think is necessary to help them exercise their powers or perform their duties and functions.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">There’s also a standalone requirement to provide information or access to an inspector:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">93 An inspector may, for a purpose related to verifying compliance or preventing non-compliance with this Act, require any person who is in possession of a 
document or information that the inspector considers necessary for that purpose to provide the document or information to the inspector or provide the inspector with access to the document or information, in the form and manner and within the time specified by the inspector.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Holy crap. Again, no court order, no warrant, no limit, no oversight.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">It’s worth noting that most social media companies don’t operate out of Canada and international law would prevent an inspector from, for example, going to California and inspecting the premises of a company there. 
</span></p><br /><h2 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;"><span style="font-family: Arial, sans-serif; font-size: 13pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Compliance orders</span></h2><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The Act grants the Commission staggeringly broad powers to issue “Compliance orders”. All these orders need is “reasonable grounds to believe”. There’s no opportunity for an operator to hear the concerns, make submissions and respond. And what can be ordered is virtually unlimited. There is no due process, no oversight, no appeal of the order and the penalty for contravening such an order is enormous. It’s up to the greater of $25 million or 8% of the operator’s global revenue. If you use Facebook’s 2023 global revenue, that ceiling is about $15 billion. 
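As a back-of-the-envelope check on that ceiling (the figures here are my assumptions, not the bill's: Meta's reported 2023 revenue of roughly US$134.9 billion, converted at an assumed rate of about 1.35 CAD/USD):

```python
def max_penalty_cad(global_revenue_cad: float) -> float:
    # Ceiling for contravening a compliance order under the bill:
    # the greater of C$25 million or 8% of the operator's global revenue.
    return max(25_000_000.0, 0.08 * global_revenue_cad)

# Assumed inputs: Meta's reported 2023 revenue of ~US$134.9 billion,
# converted at an assumed ~1.35 CAD/USD exchange rate.
meta_revenue_cad = 134.9e9 * 1.35  # roughly C$182 billion
print(f"Ceiling: ${max_penalty_cad(meta_revenue_cad) / 1e9:.1f} billion")  # Ceiling: $14.6 billion
```

On those assumptions the cap works out to roughly C$14.6 billion, which is the ballpark behind the $15 billion figure above.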
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">94 (1) If the Commission has reasonable grounds to believe that an operator is contravening or has contravened this Act, it may make an order requiring the operator to take, or refrain from taking, any measure to ensure compliance with this Act.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">This is a breathtaking power, without due process, without a hearing, without evidence and only on a “reasonable grounds to believe”. And what can be ordered is massively open-ended. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">You may note that section 124 of the Act says that nobody can be imprisoned in default of payment of a fine under the Act. The reason for this is to avoid due process. Under our laws, if there’s a possibility of imprisonment, there is a requirement for higher due process and procedural fairness. It’s an explicit decision made, in my view, to get away with a lower level of due process. 
</span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Who pays for all this?</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The Act makes the regulated operators pay to fund the costs of the Digital Safety Commission, Ombudsperson, and Office. Certainly it has some good optics that the cost of this new bureaucracy will not be paid from the public purse, but I expect that any regulated operator will be doing some math. If the cost of compliance and the direct cost of this “Digital Safety Tax” is sufficiently large, they may think again about whether to continue to provide services in Canada. We saw with the Online News Act that Meta decided the cost of carrying links to news was greater than the benefit they obtained by doing so, and then rationally decided to no longer permit news links in Canada. 
</span></p><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Amendments to the Criminal Code and the Canadian Human Rights Act </span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Finally, I agree with other commentators that bolting on amendments to the Criminal Code and the Canadian Human Rights Act was a huge mistake and will imperil any meaningful discussion of online safety. Once again, the government royally screwed up by including too much in one bill.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">The bill makes significant additions to the Criminal Code. Hate propaganda offences carry harsher penalties. 
The bill defines "hatred" (in line with Supreme Court of Canada jurisprudence) and creates a new hate crime: "offence motivated by hatred."</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Moreover, Bill C-63 amends the Canadian Human Rights Act. It adds "communication of hate speech" through the Internet or similar channels as a discriminatory practice. These amendments give individuals the right to file complaints with the Canadian Human Rights Commission, which, in turn, can impose penalties of up to $20,000. However, these changes concern user-to-user communication, not social media platforms, broadcast undertakings, or telecommunication service providers.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Bill C-63 further introduces amendments related to the mandatory reporting of child sexual abuse materials. They clarify the definition of "Internet service" to include access, hosting, and interpersonal communication like email. Any person providing an Internet service to the public must send all notifications to a designated law enforcement body. 
Additionally, the preservation period for data related to an offence is extended.</span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial, sans-serif; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Conclusion</span></h1><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">All in all, it is not as bad as I expected it to be. But it is not without its serious problems. Given that the discussion paper from a number of years ago was a potential disaster and much of that has been improved via the consultation process, I have some hope that the government will listen to those who want to – in good faith – improve the bill. That may be a faint hope, but unless it’s improved, it will likely be substantially struck down as unconstitutional.</span></p><div><span style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"><br /></span></div></span><div class="blogger-post-footer"><script type="text/javascript"><!--
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-32634324717045847952024-02-05T11:05:00.006-04:002024-02-05T11:05:57.911-04:00Canadian Bill S-210 proposes age verification for internet users<p><iframe width="640" height="480" src="https://www.youtube.com/embed/UN8eP6LlWVY" title="Your papers, please! Canadian Bill S-210 proposes age verification law for internet users." frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe><br /></p><p>There’s a <a href="https://www.parl.ca/legisinfo/en/bill/44-1/s-210" target="_blank">bill</a> working its way through the Parliament that presents a clear and present danger to the free and open internet, to freedom of expression and to privacy online. It’s a private member’s bill that shockingly has gotten traction. </p><p>You may have heard of it, thanks to Professor Michael Geist, who has called the Bill “<a href="https://www.michaelgeist.ca/2023/12/the-most-dangerous-canadian-internet-bill-youve-never-heard-of-is-a-step-closer-to-becoming-law/" target="_blank">the Most Dangerous Canadian Internet Bill You’ve Never Heard Of</a>.”</p><p>In a nutshell, it will require any website on the entire global internet that makes sexually explicit material available to verify the age of anyone who wants access, to ensure that they are not under the age of eighteen. Keeping sexually explicit material away from kids sounds like a laudable goal and one that most people can get behind. </p><p>The devil, as they say, is in the details. It presents a real risk to privacy, a real risk to freedom of expression and a real danger to the open internet in Canada. 
The author of the Bill says it does none of that, but I believe she is mistaken.</p><p>The bill was introduced in the Senate of Canada in November 2021 by <a href="https://sencanada.ca/en/senators/miville-dechene-julie/" target="_blank">Senator Julie Miville-Dechêne</a>. She is an independent senator, appointed by Prime Minister Justin Trudeau in 2018. Much of her career was as a journalist, which makes her obliviousness to the freedom of expression impact of her bill puzzling. I don’t think she’s acting in bad faith, but I think she’s mistaken about the scope and effect of her Bill. </p><p>In 2022, the Bill was considered by the Senate Standing Committee on Legal and Constitutional Affairs. That Committee reported it back to the Senate in November 2022, and it languished until it passed third reading in April 2023 and was referred to the House of Commons. Many people were surprised when the House voted in December 2023 to send it for consideration before the Standing Committee on Public Safety and National Security. Every Conservative, Bloc and NDP member present voted in favour of this, while most Liberals voted against it. Suddenly, the Bill had traction and what appeared to be broad support among the opposition parties. </p><p>So what does the bill do and why is it problematic? Let’s go through it clause by clause. </p><p style="text-align: left;">The main part of it – the prohibition and the offence – is contained in section 5. It creates an offence of “making available” “sexually explicit material” on the Internet to a young person. This incorporates some defined terms, from section 2. 
</p><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p style="text-align: left;"><b>Making sexually explicit material available to a young person</b></p><p style="text-align: left;">5 Any organization that, for commercial purposes, makes available sexually explicit material on the Internet to a young person is guilty of an offence punishable on summary conviction and is liable,</p></blockquote><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p style="text-align: left;">(a) for a first offence, to a fine of not more than $250,000; and</p></blockquote><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p style="text-align: left;">(b) for a second or subsequent offence, to a fine of not more than $500,000.</p></blockquote></blockquote><p>“Making available” is incredibly broad. When a definition says “includes”, it means that it can cover more than the terms that follow. “Transmitting” is a very, very broad term. Is that intended to cover the people who operate the facilities over which porn is transmitted? </p><p>A “young person” is a person under the age of 18. That’s pretty clear. </p><p>The definition of “sexually explicit material” is taken from the Criminal Code. It should be noted that this definition was created and put in the Criminal Code for a particular purpose. This is not a catch-all offence that makes it illegal to make sexually explicit material available to a young person. This is an element of an offence, where the purpose of providing this material to a young person is to facilitate another offence against a young person. Essentially, grooming a young person. 
</p><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p><b>Definition of sexually explicit material</b></p><p>(5) In subsection (1), sexually explicit material means material that is not child pornography, as defined in subsection 163.1(1), and that is</p><p>(a) a photographic, film, video or other visual representation, whether or not it was made by electronic or mechanical means,</p></blockquote><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p>(i) that shows a person who is engaged in or is depicted as engaged in explicit sexual activity, or</p></blockquote><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p>(ii) the dominant characteristic of which is the depiction, for a sexual purpose, of a person’s genital organs or anal region or, if the person is female, her breasts;</p></blockquote></blockquote><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p>(b) written material whose dominant characteristic is the description, for a sexual purpose, of explicit sexual activity with a person; or</p><p>(c) an audio recording whose dominant characteristic is the description, presentation or representation, for a sexual purpose, of explicit sexual activity with a person.</p></blockquote><p>To be clear, it is not a crime to make this sort of material available to a young person unless you’re planning further harm to the young person. </p><p>Let’s look at what is included in this definition. Visual, written or audio depictions of explicit activity. And visual depictions of certain body parts or areas, if it’s done for a sexual purpose. </p><p>In paragraph 5(a)(i), it does not say that the depiction has to be explicit. It says the activity in which a person is engaged is explicit. 
</p><p>Let’s take a moment and let this sink in. This is not limited to porn sites. </p><p>This sort of material is broadcast on cable TV. It’s certainly available in adult book stores (which specialize in certain types of publications), but it’s also available in general book stores. This sort of material is available in every large library in Canada. </p><p>This definition would include educational materials. </p><p>This definition is so broad that it covers wikipedia articles related to art, reproduction and sexual health. </p><p>It is certainly not limited to materials that would cause a reasoned risk of harm to a young person. And it doesn’t take any account of the different maturity levels of young people. The sex ed curriculum is very different for 14 year olds, 16 year olds and 18 year olds. </p><p>Section 6 is where the government mandated age verification technology comes in. Essentially, you can’t say that you thought you were only providing access to the defined material to adults. You have to implement a government prescribed age verification method to ensure that the people getting access are not under 18. That’s essentially the only due diligence defence. We’ll talk about government prescribed age verification methods shortly.</p><p>There’s another defence, which is “legitimate purpose”. </p><p>No organization shall be convicted of an offence under section 5 if the act that is alleged to constitute the offence has a “legitimate purpose related to science, medicine, education or the arts.” Maybe that will be interpreted broadly so that wikipedia articles related to art, reproduction and sexual health are not included. But it’s a defence, so it has to be raised after the person is charged. The onus is on the accused to raise it, not on the prosecution to take it into account at the time of laying a charge. </p><p>There’s also a defence that’s available if the organization gets a “Section 8” notice and complies with it. 
“What the heck are those?” you may ask. The bill has an “enforcement authority”, who I’m afraid will be the CRTC.</p><p>If they have reasonable grounds to believe that an organization committed an offence under section 5 (by allowing young people to access explicit materials), the enforcement authority may issue a notice to them under this section.</p><p>The notice names the organization, tells them they have reasonable grounds to believe they are violating the Act – but does not have to tell them the evidence of this. And they essentially get to order the organization to take “steps that the enforcement authority considers necessary to ensure compliance with this Act”. It doesn’t say “<b>THAT ARE NECESSARY</b>”, but what the enforcement authority thinks is necessary. </p><p>So the organization has twenty days to do all the things specified in the notice. They do get to make representations to the enforcement authority, but that doesn’t stop the clock. The 20 days keeps ticking. </p><p>Here’s where the rubber hits the road. </p><p>The “enforcement authority”, if they are not satisfied that the organization has taken the steps that the enforcement authority deems to be necessary, the enforcement authority gets to go to the Federal Court to get an order essentially blocking the site. Specifically, it says: “for an order requiring Internet service providers to prevent access to the sexually explicit material to young persons on the Internet in Canada.”</p><p>Any Internet service provider who would be subject to the order would be named as a respondent to the proceedings, and presumably can make submissions. But I can only think of one or two internet service providers who would do anything other than consent to the order, while privately cheering. 
</p><p>Take a look at this section, which sets the criteria for the issuance of an order.</p><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p>(4) The Federal Court must order any respondent Internet service providers to prevent access to the sexually explicit material to young persons on the Internet in Canada if it determines that</p></blockquote><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p>(a) there are reasonable grounds to believe that the organization that has been given notice under subsection 8(1) has committed the offence referred to in section 5;</p></blockquote><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p>(b) that organization has failed to take the steps referred to in paragraph 8(2)(c) within the period set out in paragraph 8(2)(d); and</p></blockquote><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p>(c) the services provided by the Internet service providers who would be subject to the order may be used, in Canada, to access the sexually explicit material made available by that organization.</p></blockquote></blockquote><div>It says the Court MUST issue the order – not MAY, but MUST, if there are reasonable grounds to believe that the organization committed the offence under the Act. It doesn’t require proof beyond a reasonable doubt, it doesn’t even require proof by a civil standard (being on a balance of probabilities or more likely than not), and it doesn’t even require actual belief based on evidence that an offence was committed. It requires only “reasonable grounds to believe.” </div><p>And it requires them to have not taken all the steps dictated by the enforcement authority within the extremely brief period of twenty days. 
</p><p>Finally, the order MUST issue if the court determines “the services provided by the Internet service providers who would be subject to the order MAY be used, in Canada, to access the sexually explicit material made available by that organization”.</p><p>That is a really, really low bar for taking a site off the Canadian internet. </p><p>But wait – there’s more!</p><p>The Act specifically authorizes wide-ranging orders that would have the effect of blocking material that is not explicit and barring adult Canadians from seeking access to that same explicit material.</p><p>And if you look at the opening words of subsection 9(5), it says “If the Federal Court determines that it is necessary to ensure that the sexually explicit material is not made available to young persons on the Internet in Canada”. It doesn’t say anything about limiting the continuation of the offence or even tying it to the alleged offence set out in the notice. This is really poorly drafted and constructed.</p><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p><b>Effect of order</b></p><p>(5) If the Federal Court determines that it is necessary to ensure that the sexually explicit material is not made available to young persons on the Internet in Canada, an order made under subsection (4) may have the effect of preventing persons in Canada from being able to access</p></blockquote><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p>(a) material other than sexually explicit material made available by the organization that has been given notice under subsection 8(1); or</p></blockquote><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p>(b) sexually explicit material made available by the organization that has been given notice under subsection 8(1) even if the person seeking 
to access the material is not a young person.</p></blockquote></blockquote><p>So, as we’ve seen, all of this hinges on companies verifying the age of users before allowing access to explicit material and the only substantial defence to the offence set out in the act is to use a government-dictated and approved “age verification method.” </p><p>We need to remember, adult Canadians have an unquestioned right to access just about whatever they want, including explicit material.</p><p>The criteria for approving an age verification method may be the only bright spot in this otherwise dim Act. And it’s only somewhat bright.</p><p>Before prescribing an age-verification method, the government has a long list of things they have to consider. </p>Specifically, the Governor in Council must consider whether the method<br /><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px; text-align: left;"><p>(a) is reliable;</p><p>(b) maintains user privacy and protects user personal information;</p><p>(c) collects and uses personal information solely for age-verification purposes, except to the extent required by law;</p><p>(d) destroys any personal information collected for age-verification purposes once the verification is completed; and</p><p>(e) generally complies with best practices in the fields of age verification and privacy protection.</p></blockquote><p>They just have to consider these. They’re not “must haves”, but good to haves. And there’s no obligation on the part of the government to seek input from the Privacy Commissioner. </p><p>So what’s the current state of age verification? It’s not uncommon to require a credit card, under the assumption that a person with a valid credit card is likely an adult. I’m not sure that’s the case any more and it may not be reliable. </p><p>There’s also ID verification, often coupled with biometrics. 
You take a photo of your government-issued ID, take a selfie, and software reads the ID, confirms you’re over 18 and compares the photo on the ID to the photo you’ve taken. That involves collecting personal information from your ID, which very likely includes way more information than is necessary to confirm your age. It involves collecting your image, and it involves collecting and using the biometrics from your selfie and your ID.</p><p>Do you really want to provide your detailed personal information, that could readily be used for identity theft or fraud, to a porn site? Or a third party “age verification service”?</p><p>One scheme was proposed in the UK a number of years ago, in which you would go to a brick and mortar establishment like a pub or a post office, show your ID and be given a random looking code. That code would confirm that someone reliable checked your ID and determined you were of age. Of course, this becomes a persistent identifier that can be used to trace your steps across the internet. And I can imagine a black market in ID codes emerging pretty quickly.</p><p>And there are some important things missing. For example, is it universally applicable? Not everyone has government-issued ID. Some systems rely on having a valid credit card. Not everyone has one, let alone a bank account. </p><p>The Bill’s sponsor and supporters say “smart people will come up with something” that is reliable and protects privacy. Why don’t we wait until we have that before considering passing a bill like this?</p><p>Let’s game this out with a hypothetical. Imagine, if you will, a massive online encyclopedia. It has thousands upon thousands – maybe millions – of articles, authored by thousands of volunteers. They cover the full range of subjects known to humanity, which of course includes reproduction and sexual health. 
A very small subset of the content they host and that their volunteers have created would fit into the category of “sexually explicit material”, but it is there, it exists and it is not age-gated. </p><p>The operators of this encyclopedia very reasonably take the view that their mission is educational and they’re entitled to the protection of the legitimate purpose defence that is supposed to protect “science, medicine, education or the arts”.</p><p>They also take the view that providing access to their educational material in Canada is protected by the Charter of Rights and Freedoms. And they also reasonably take the view that the Charter protects the rights of Canadians to access the content they produce. </p><p>But one day, a busy-body complains to the CRTC’s porn force that this online encyclopedia contains material that may be sexually explicit. The captain of the porn force drafts up a notice under Section 8, telling them that they must make sure that only people who have confirmed their age of majority via a government approved age verification technique can get access to explicit content. </p><p>The encyclopedia writes back and says “please let us know what is your criteria for judging whether something is published ‘for a sexual purpose’, as required in many parts of the definition.” Also, they say, their purpose is entirely educational, so they have a legitimate purpose. And they also mention the Charter. Meanwhile, 20 days pass by.</p><p>So the porn force makes an application in the Federal Court and serves notice on all the major internet service providers. None of the internet service providers show up at the hearing. The publishers of the encyclopedia hire a really good Canadian internet lawyer, who tells the court that the encyclopedia’s purpose is legitimate and related to education. And they’re likely not engaged in “commercial activity”. 
And cutting off access to the encyclopedia would be unconstitutional as a violation of the Canadian Charter of Rights and Freedoms. </p><p>The government lawyer, on behalf of the porn force, points to section 9(4) and says the court has no discretion to NOT issue the order if there are reasonable grounds to believe an offence has been committed and they didn’t follow the dictates set out in the Section 8 notice. </p><p>Even with the encyclopedia's information about their purposes, the bar of “reasonable grounds to believe” is so low that paragraph (a) is met. Since the encyclopedia didn’t follow the Section 8 notice because they were sure they had a defence to the charge, paragraph (b) is met. And an order to all Canadian ISPs to block access to the encyclopedia would have the effect set out in paragraph (c). </p><p>Slam dunk. The Court must issue that order. But what about the fact that it would have the effect of cutting ALL Canadians off from the 99.999% of the site’s content that is not explicit? Tough. Subsection 9(5) says that’s ok. No encyclopedia for you!</p><p>A Charter challenge would then be raised, and the whole thing would likely be declared unconstitutional as a violation of section 2(b) of the Charter that can’t be justified by section 1. </p><p>In short, even if you think this Bill is well-intentioned, it is heavy-handed, poorly constructed, doesn’t take freedom of expression into account and imagines that we can manufacture some magical fairy-dust technology that will make the obvious privacy issues disappear. It is a blunt instrument that merely imagines it’ll fix the problem. </p><p>And I should note that it will likely also have the effect of hurting older children who haven’t yet hit eighteen. The internet and its many communities and information repositories are all critical for young people seeking legitimate information related to sexual health, sexual orientation and gender identity. 
Much of this information would fit into the broad definition of sexually explicit material, and it will be illegal for someone to allow them access via the internet. It will remain legal for them to get it in a bookstore or a library, but that’s not how young people generally access information in 2024. </p><p>I expect some supporters of this bill will be more than happy to see it limit Canadians’ right to access lawful material.</p><p>It’s good to see a discussion of this important issue. Even if you’re in favour of the objectives of this Bill, it is deeply, deeply problematic. It should be parked until there’s a way to deal with this issue without potentially violating the privacy rights and Charter rights of Canadians.</p><p><br /></p><div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-31993324259111766912023-12-20T19:16:00.002-04:002023-12-20T19:16:57.727-04:00How the Grinch Stole Privacy - A Privacylawyer Holiday Special<p><p> <iframe width="720" height="480" src="https://www.youtube.com/embed/hIPm4nte160" title="How the Grinch Stole Privacy - A Privacylawyer Holiday Special" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe><br /></p>
<p>I also had the opportunity to talk about this silly take with CBC Information Morning Halifax and Cape Breton. You can listen to the interviews here: <a href="https://www.cbc.ca/listen/live-radio/1-27-information-morning-ns/clip/16030859-pleading-25th-david-fraser-legal-advice-favourite-christmas" target="_blank">Halifax</a>, <a href="https://www.cbc.ca/listen/live-radio/1-24-information-morning-cape-breton/clip/16031138-a-legal-retainer-grinch?share=true" target="_blank">Cape Breton</a>. <div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-16000799921026395882023-12-13T22:15:00.001-04:002023-12-13T22:15:34.221-04:00Federal Court concludes that a “virtual presence” in Canada is enough to be ordered to assist CSIS<p style="background-color: white; color: #242424; margin: 0px 0px 6pt 36pt; text-align: left;"><i><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">Decision follows trend starting in BC that a virtual presence in Canada is enough to be ordered to produce records</span></span></i></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 36pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">The Federal Court of Canada, in connection with an application for a warrant and an assistance order under the <i>Canadian Security Intelligence Service Act</i>, was required to consider whether an assistance order under s. 22.3(1) of that Act could be issued to order a legal person with no physical presence in Canada to assist CSIS with giving effect to a warrant. 
The order would have extra-territorial effect.</span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 36pt; text-align: left;"><span style="font-family: arial;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;">In a redacted decision, </span><a data-auth="NotApplicable" data-linkindex="1" href="https://canlii.ca/t/k11kb" rel="noopener noreferrer" style="border: 0px; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;" target="_blank"><i><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;">Re Canadian Security Intelligence Service Act (Can)</span></i></a><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;">, the court concluded that it can, provided that the subject of the assistance order has a “virtual presence” in Canada. 
The decision notes that the foreign company involved was willing to assist, but needed to see a court order to manage their possible legal liability:</span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 72pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">[3] The affiant explained that [REDACTED] is incorporated and headquartered in [REDACTED] does not have physical offices or employees in Canada. It has a virtual presence in Canada that consists of [_some physical presence in Canada_]. It solicits business from Canadians and [REDACTED].</span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 72pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;"> </span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 72pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">[4] The affiant also explained that [REDACTED] has been fully cooperative in providing assistance to CSIS to date, but has advised CSIS that 
it requires a judicial authorization from a Canadian court to minimize its legal risk in the event that CSIS uses the collected intelligence beyond analysis; [REDACTED]. [REDACTED] advised that it would continue to be cooperative pending and upon receipt of an Assistance Order.</span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 36pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">The company’s willingness to comply wasn’t particularly material to the Court’s decision.</span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 36pt; text-align: left;"><span style="font-family: arial;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;">At the urging of the government and largely supported by a court-appointed <i>amicus</i>, the Court followed a trend of cases that have dealt with similar questions but involving production orders under the <i>Criminal Code</i>. 
The first of these cases is </span><a data-auth="NotApplicable" data-linkindex="2" href="https://canlii.ca/t/hplpj" rel="noopener noreferrer" style="border: 0px; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;" target="_blank"><i><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;">British Columbia (Attorney General) v. Brecknell</span></i></a><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;">, where the Royal Canadian Mounted Police were seeking to obtain a production order naming Craigslist. As with this CSIS case, Craigslist said they’d cooperate but needed to see a court order. 
The British Columbia Court of Appeal, influenced by the </span><a data-auth="NotApplicable" data-linkindex="3" href="https://www.canlii.org/en/ca/scc/doc/2017/2017scc34/2017scc34.html" rel="noopener noreferrer" style="border: 0px; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;" target="_blank"><i><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;">Equustek</span></i></a><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"> case from the Supreme Court of Canada, concluded that a court has jurisdiction to issue a production order naming an entity physically beyond the court’s jurisdiction provided they had a “virtual presence” within the jurisdiction.</span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 36pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">The Court concluded:</span></span></p><p style="background-color: white; color: 
#242424; margin: 0px 0px 6pt 72pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">[49] I find that the jurisprudence in the context of production orders issued pursuant to section 487.014 of the <i>Criminal Code</i> provides a good analogy and support for finding that this Court has the jurisdiction to issue an Assistance Order where in personam jurisdiction can be established. The two provisions are similar in purpose, albeit in different contexts, both are directed to a person, which includes an organization or entity that is a legal person, and similar considerations arise in determining whether the order should be issued where the subject has only a virtual presence in Canada.</span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 72pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">[50] The considerations noted by the SCC in <i>Equustek</i> lend further support to taking an approach that reflects the realities of the internet dominated storage and transmission of documents and information. 
As noted in <i>Brecknell</i>, document control may exist in one jurisdiction, and the documents in another or in several others and “formalistic distinctions” between virtual and physical presence defeat the purpose of the legislation.</span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 72pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">[51] Whether an organization or entity with only a virtual presence in Canada can establish a real and substantial connection with Canada sufficient to constitute presence in Canada will be a case-by-case determination. Where such <i>in personam</i> jurisdiction is established, the organization or entity that is subject to the Assistance Order and required to provide documents in their possession or control is considered to be in Canada although the documents may be stored elsewhere.</span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 36pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">As with a number of the cases following <i>Brecknell</i>, the Court concluded that its ability to issue the order does not turn on whether it would be able to enforce the order, though that is a relevant consideration:</span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 72pt; text-align: 
left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">[53] I have considered the issue of enforcement of the Assistance Order on [REDACTED]. I note that they have been cooperative to date and indicate their ongoing intention to cooperate. However, I also agree with the submissions of the AGC and <i>amicus</i> and the jurisprudence, that the enforcement of the Order is a separate issue from whether the Court has jurisdiction to issue the Order, but remains a relevant consideration with respect to whether the Order should be issued based on the particular circumstances.</span></span></p><p style="background-color: white; color: #242424; margin: 0px 0px 6pt 36pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-style: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;">Consistent with the previous production order cases cited, the intended recipient was not a party to the hearing. 
All were <i>ex parte</i>, but some included <i>amici</i>.</span></span></p><p style="background-color: white; margin: 0px 0px 6pt 36pt; text-align: left;"><span style="border: 0px; color: inherit; font-feature-settings: inherit; font-kerning: inherit; font-optical-sizing: inherit; font-stretch: inherit; font-variant: inherit; font-variation-settings: inherit; font-weight: inherit; line-height: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: arial;"><span style="font-style: inherit;">Note: I believe that </span><i>Brecknell</i><span style="font-style: inherit;"> was wrongly decided, but because all of these orders have been </span><i>ex parte</i><span style="font-style: inherit;"> and unopposed, it'll be some time before these arguments will be made in court. See: </span></span></span><span style="background-color: transparent;"><span style="color: #242424; font-family: arial;">David T Fraser, "<a href="https://digitalcommons.schulichlaw.dal.ca/cjlt/vol18/iss1/5/" target="_blank">British Columbia (Attorney General) v. Brecknell", Case Comment</a>, (2020) 18:1 CJLT 135.</span></span></p><div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-18578020291741955942023-12-03T18:00:00.000-04:002023-12-03T18:00:00.128-04:00Being on the receiving end of a warrant from the Canadian Security Intelligence Service (CSIS)<p><iframe width="720" height="360" src="https://www.youtube.com/embed/kkyM3Rkxgxc" title="" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
<p><span face="Arial, sans-serif" style="font-size: 11pt; white-space-collapse: preserve;"><b>So someone from CSIS just called ….</b></span></p><span id="docs-internal-guid-51fcdc15-7fff-39c2-3e95-267a7b22eeea"><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhjD4pb-qQkOmJbFoMWh5V_-2lC9sL5kiruJzuqH_IWMe-KGYCpj34K9Rz5XVTwmn4l0L4CgGXNPBsPV-a4kDT32Va_zbxqHBFT7sDdG3pkw0j5TfCGuOnf9YUxegbmv1K7IJSvpU50owcbC5aTN4RAavgaq6l33xTpbmyW8XFbOHX7O8WChfPnVQ/s1024/DALL%C2%B7E%202023-11-30%2008.15.12%20-%20A%20Canadian%20spy%20in%20a%20sleek%20black%20suit,%20black%20gloves,%20and%20dark%20sunglasses,%20now%20additionally%20wearing%20a%20red%20and%20white%20striped%20toque,%20symbolizing%20Canada.%20H.png" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" data-original-height="1024" data-original-width="1024" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhjD4pb-qQkOmJbFoMWh5V_-2lC9sL5kiruJzuqH_IWMe-KGYCpj34K9Rz5XVTwmn4l0L4CgGXNPBsPV-a4kDT32Va_zbxqHBFT7sDdG3pkw0j5TfCGuOnf9YUxegbmv1K7IJSvpU50owcbC5aTN4RAavgaq6l33xTpbmyW8XFbOHX7O8WChfPnVQ/s320/DALL%C2%B7E%202023-11-30%2008.15.12%20-%20A%20Canadian%20spy%20in%20a%20sleek%20black%20suit,%20black%20gloves,%20and%20dark%20sunglasses,%20now%20additionally%20wearing%20a%20red%20and%20white%20striped%20toque,%20symbolizing%20Canada.%20H.png" width="320" /></a></div><br /><span face="Arial, sans-serif" style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"><br /></span><p></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span face="Arial, sans-serif" style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: 
normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">There’s a first time for everything. You get a call from an “UNKNOWN NUMBER” and the caller says they work with Public Safety Canada and they’re looking for some information. This happens from time to time at universities, colleges, telecoms, internet-based businesses and others. Likely, they actually work for the Canadian Security Intelligence Service (known as CSIS) and they’re doing an investigation. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span face="Arial, sans-serif" style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">So what happens – or should happen – next? You should ask them what they’re looking for and what is their lawful authority. Get their contact information and then you should call a lawyer who has dealt with this sort of situation before. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span face="Arial, sans-serif" style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">CSIS is an unusual entity. They’re not a traditional law enforcement agency. While they can also get warrants (more about that later), they have a very different mission. 
The mandate of CSIS is to </span></p><br /><ul style="margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;"><li aria-level="1" dir="ltr" style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; list-style-type: disc; vertical-align: baseline; white-space: pre;"><p dir="ltr" role="presentation" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; text-wrap: wrap; vertical-align: baseline;">investigate activities suspected of constituting threats to the security of Canada (espionage/sabotage, foreign interference, terrorism, subversion of Canadian democracy);</span></p></li><li aria-level="1" dir="ltr" style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; list-style-type: disc; vertical-align: baseline; white-space: pre;"><p dir="ltr" role="presentation" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; text-wrap: wrap; vertical-align: baseline;">take measures to reduce these threats;</span></p></li><li aria-level="1" dir="ltr" style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; list-style-type: disc; vertical-align: baseline; white-space: pre;"><p dir="ltr" role="presentation" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; 
font-variant-numeric: normal; font-variant-position: normal; text-wrap: wrap; vertical-align: baseline;">provide security assessments on individuals who require access to sensitive government information or sensitive sites;</span></p></li><li aria-level="1" dir="ltr" style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; list-style-type: disc; vertical-align: baseline; white-space: pre;"><p dir="ltr" role="presentation" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; text-wrap: wrap; vertical-align: baseline;">provide security advice relevant to the Citizenship Act or the Immigration and Refugee Protection Act; and</span></p></li><li aria-level="1" dir="ltr" style="font-family: Arial, sans-serif; font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; list-style-type: disc; vertical-align: baseline; white-space: pre;"><p dir="ltr" role="presentation" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; text-wrap: wrap; vertical-align: baseline;">collect foreign intelligence within Canada at the request of the Minister of Foreign Affairs or the Minister of National Defence.</span></p></li></ul><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span face="Arial, sans-serif" style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">To carry 
out this mandate, CSIS may seek and obtain warrants. But they are unlike any warrant or production order you may see handed to you by a cop. CSIS warrants are more complicated to understand and possibly comply with than the more traditional law enforcement variety.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span face="Arial, sans-serif" style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">Canadians are often surprised to discover that we have a court that meets in secret, in a virtual bunker and hears applications for TOP SECRET warrants. These warrants can authorize “the persons to whom it is directed to intercept any communication or obtain any information, record, document or thing and, for that purpose, (a) to enter any place or open or obtain access to any thing; (b) to search for, remove or return, or examine, take extracts from or make copies of or record in any other manner the information, record, document or thing; or (c) to install, maintain or remove any thing.” These warrants can be accompanied by an assistance order, directing a person to assist with giving effect to a warrant. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span face="Arial, sans-serif" style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">A problem for third parties with these warrants is that they can be long-term and very open ended. The name of the target of the investigation may be unknown at the time the warrant was obtained, and the warrant may authorize the collection of data related to that unknown person. 
It can authorize the collection of information about people who are in contact with that unknown person. It may authorize the collection of additional information related to those persons, such as IP addresses, email addresses, communications and even real-time interception of communications. Once the unknown person has been identified by CSIS (by name, an account identifier, online handle, etc.), they will seek to obtain further information. But the warrant itself likely does not name the person or any account identifiers, so the custodian of information cannot easily connect the request to a particular individual or account. And the recipient of the demand must be confident that they are authorized to disclose the requested information, otherwise they would be in violation of privacy laws. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span face="Arial, sans-serif" style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">To complicate things further, because these warrants are generally secret, CSIS is not willing to provide a copy of the complete warrant to a third party from whom they are seeking data. They will generally permit you to look at a redacted version of the warrant but will not let you keep it. Diligent organizations know they can only disclose personal information if it is authorized and permitted by law, and that they have a duty to ensure that they disclose only the responsive information. To do otherwise risks violating applicable privacy laws. Organizations should also document all aspects of the interaction and disclosure, which is a problem if you can’t get a copy of the warrant. Over time, procedures have been developed by CSIS and third-party organizations to address this. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span face="Arial, sans-serif" style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;">While all of this may be TOP SECRET, nothing precludes a recipient of a warrant or an assistance order from seeking legal advice on how to properly and lawfully respond. Anyone dealing with such a situation should seek experienced legal advice. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-size: 14.6667px; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"><span face="Arial, sans-serif">In just the past few weeks, the Government of Canada launched a <a href="https://www.publicsafety.gc.ca/cnt/rsrcs/pblctns/2023-nhncng-frgn-nflnc-mnd-csis/index-en.aspx" target="_blank">consultation </a>on possible reforms to the CSIS Act, mainly under the banner of protecting Canadian democracy against foreign interference. Of course, changes to the statute will affect other aspects of their mission. The consultation is broadly organized under five “issues”, and it’s Issue #2 that is the most relevant to this discussion.
<br /></span></span></p></span><blockquote style="border: none; margin: 0px 0px 0px 40px; padding: 0px;"><span><p style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="font-size: 14.6667px; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"><span face="Arial, sans-serif"><b>Issue #2: Whether to implement new judicial authorization authorities tailored to the level of intrusiveness of the techniques</b></span></span></p></span></blockquote><span><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-size: 14.6667px; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"></span><span style="font-size: 14.6667px; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"><span face="Arial, sans-serif">
Essentially, what they’re proposing is a form of production order similar to what we have in the Criminal Code of Canada. Such an order would still be subject to court approval and could compel a third party to produce information “where CSIS has reasonable grounds to believe that the production of the information is likely to yield information of importance that is likely to assist CSIS in carrying out its duties and functions.” Examples they give are basic subscriber information, call detail records, or transaction records. These would be much more targeted and, in my view, much easier for the custodian of the information to evaluate and respond to.
A production order would authorize CSIS to obtain the basic subscriber information of a named person or known account identifier. Under the current warrant authority, those specific people may be unknown at the time the warrant was issued but are still within the ambit of the warrant. Presumably a CSIS production order could be served in the same way as a Criminal Code production order, and the company could keep a copy of it for its records.
I’m generally very skeptical about the expansion of intrusive government powers, particularly when much of it takes place not in open court but in a closed one, but I don’t see this as an expansion. CSIS can be given this ability, supervised by the court, to streamline its existing authorities.
They would need to be very careful if they were to purport to give it extraterritorial effect, since that would likely be very offensive to comity and the sovereignty of other countries. And intelligence collection is generally more offensive and aggressive than investigating ordinary crime. It may specifically be illegal under foreign law for the company to provide data in response to such an order. And I think the order should, like a Criminal Code production order, explicitly give the recipient the right to challenge it.
So that’s the current situation with CSIS investigations, at least from a service provider’s point of view, and a hint at what’s to come.
Again, if you find yourself in the uncomfortable and unfamiliar situation of taking a call from “public safety” or CSIS, reach out to get experienced legal advice from a lawyer who has been through the process before.
</span></span></p><div><span style="font-size: 14.6667px; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"><span face="Arial, sans-serif"><br /></span></span></div><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-size: 14.6667px; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"><span face="Arial, sans-serif"><br /></span></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span face="Arial, sans-serif" style="font-size: 11pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; font-variant-position: normal; vertical-align: baseline; white-space-collapse: preserve;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p></span><div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0Halifax, NS, Canada44.6475811 -63.572768316.337347263821151 -98.7290183 72.957814936178835 -28.4165183tag:blogger.com,1999:blog-6273930.post-55428978776481197252023-11-18T18:01:00.005-04:002024-01-04T18:05:38.511-04:00What is the "legitimate interests" exception to consent under Canada's proposed privacy law?<p><iframe width="720" height="480" src="https://www.youtube.com/embed/eyPsUg0f1aw" title="What is the "legitimate interests" exception to consent under Canada's proposed privacy law?" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe><p>So Bill c-27, also known as the digital charter implementation act of 2022 has been before Canada's Parliament for consideration for quite some time. Even before this parliamentary session, a bill substantially similar to the present one was tabled and then died on the order paper in the previous parliamentary session. After more than 20 years of the personal information protection electronic documents act, people have had a long time to think about improvements that perhaps could or should be made to our national privacy regime .</p><p>One thing that I've heard over and over again, particularly from privacy activists since 2018 is the suggestion that Canada should simply follow Europe's lead and implement a form of its general data protection directive. Privacy activists and others hail it as the “gold standard”. </p><p>Sometimes when I hear more from these folks, I realize that for some of them, it appears that all they know about the GDPR is the possibility of massive, company-ruining penalties. What they don't seem to understand is that it is relatively rare in Europe for a business to use consent as the basis for the collection, use or disclosure of personal information. 
This is in stark contrast to the current law, PIPEDA, where consent really is the only lawful basis for collecting, using and disclosing personal information. </p><p>Here is a case in point. It is an op-ed in The Globe and Mail written by the former co-CEO of Research In Motion, also known as BlackBerry, and more recently, the philanthropist behind the Centre for Digital Rights and the Centre for International Governance Innovation, Jim Balsillie. </p><p>In this op-ed, Balsillie praises “the EU's landmark general data protection regulation, a law that sets the baseline for modern protections around the world…”</p><p>He then goes on to viciously attack a portion of Bill C-27, in the CPPA, that is modeled directly on a provision from the GDPR: the ability for an organization to collect, use or disclose personal information without consent on the basis of legitimate interests.</p><p>Here is what Jim has to say in his op-ed: “For example, the proposed new law creates a broad carve-out for surveillance without knowledge or consent based on legitimate interests… There's worse: it's the businesses themselves that determine what constitutes legitimate interest for surveillance and they are under no obligation to tell the individual they are tracking and profiling them.”</p><p>Look, either it is the gold standard or it is not.</p><p>And I really shouldn't have to tell a business leader that every one of us gets to decide how we comply with the law, and if that assessment is incorrect, that is where enforcement comes in. The bill contains detailed provisions about what can be a legitimate interest and what cannot. Frankly, I am getting a little tired of this breathless hyperbole and want to set the record straight on what “legitimate interests” is and what it is not.</p><p>First, we'll look at the GDPR, then we will look at Bill C-27.</p><p>Article 6 of the GDPR outlines the lawful bases for processing personal data. 
These include consent, contract, legal obligation, vital interests, public task, and legitimate interests. We’re going to zoom in on the last one – legitimate interests.</p><p>Legitimate interests are one of the more flexible lawful bases and probably the most-used. It is also the most open to interpretation. It allows data processing on the basis of the legitimate interests pursued by a data controller or a third party, unless such interests are overridden by the interests or fundamental rights and freedoms of the data subject.</p><p>This requires the data controller to carry out an analysis to see if “legitimate interests” can be used instead of another basis, such as consent. </p><p>To rely on legitimate interests, you must:</p><p>1. Identify a legitimate interest (be it commercial, individual, or societal benefits).</p><p>2. Show that the processing is necessary to achieve it.</p><p>3. Balance it against the individual’s interests, rights, and freedoms. This involves conducting a Legitimate Interests Assessment (LIA).</p><p>Legitimate interests can include network and information security, preventing fraud, direct marketing, and the like. </p><p>Using “legitimate interests” is not just carte blanche to do whatever you want. When invoking legitimate interests, the controller has to ensure transparency, adhere to data minimization principles, and implement safeguards to protect the rights of individuals. </p><p>The proposed Consumer Privacy Protection Act in Canada has a similar framework. 
Personally, I think it should be replaced with an almost word for word copy from the GDPR in order to remove – or at least reduce – unnecessary barriers for organizations that operate internationally.</p><p>But let's focus on what is in fact written in the bill as it currently exists.</p><p>In section 18(3), it says an organization may collect or use an individual's personal information without their knowledge or consent if the collection or use is made for the purpose of an activity in which the organization has a legitimate interest that outweighs any potential adverse effect on the individual resulting from that collection or use, a reasonable person would expect the collection or use for such an activity, and the personal information is not collected or used for the purpose of influencing the individual’s behavior or decisions.</p><p>So like in Europe, it requires balancing the organization's interest against the interest of the individual. Unlike in Europe, it requires that the collection or use be for purposes that would essentially be obvious or expected by the individual. It is unclear what the intended scope of paragraph (b) is, since there are so many things that happen in the world that would reasonably be expected to alter somebody's behavior.</p><p>Subsection (4) sets a requirement that must be met before an organization relies on legitimate interests for the collection or use of personal information. It says that prior to collecting or using personal information under subsection (3), the organization must identify any potential adverse effect on the individual that is likely to result from the collection or use, then identify and take reasonable measures to reduce the likelihood that the effects will occur or to mitigate or eliminate them, and comply with any prescribed requirements. 
That means that additional requirements could be set out in regulations to come.</p><p>Then it says in subsection (5) that the organization must record its assessment of how it meets the condition set out in subsection (4) and must, on request, provide a copy of the assessment to the Privacy Commissioner. </p><p>This doesn't, to me, sound like a completely arbitrary mechanism where organizations get to draw the line wherever they want. They have to document that decision-making and have to make it available to the Privacy Commissioner on request.</p><p>But that is not the end of it. Section 62 talks about what an organization has to include in its privacy statement to the public, and it says that they have to provide a general account of how the organization uses personal information and how it applies the exceptions to the requirement to obtain an individual's consent under the Act, including a description of any activities referred to in subsection 18(3) in which it has a legitimate interest. </p><p>So this means that every organization that determines that it is appropriate to use legitimate interests for the collection or use of personal information has to document its decision-making in a defensible manner, knowing that it could be presented to the Privacy Commissioner. And they don't get to do it sneakily, as the breathless critics would have you think, because they have to publish it in black and white, in plain language, in their public-facing privacy statement.</p><p>In addition to the legitimate interests basis for the collection or use of personal information, the proposed CPPA also includes certain categories of business activities for which personal information can be collected or used without an individual's knowledge or consent. 
This is in subsection 18(1).</p><p>This says an organization may collect or use an individual's personal information without their knowledge or consent if the collection or use is made for the purpose of a business activity described in subsection (2), a reasonable person would expect the collection or use for such an activity, and the personal information is not collected or used for the purpose of influencing the individual’s behavior or decisions. Does that sound familiar? It is a similar framework to what is in subsection 18(3). </p><p>Subsection (2) sets out the permissible business activities that fit within this exception. The first one is an activity that is necessary to provide a product or service that the individual has requested from the organization. It has to be necessary. Or it can be an activity that is necessary for the organization's information, system or network security. Or an activity that is necessary for the safety of a product or service that the organization provides. Or any other prescribed activity that could be set out in future regulations.</p><p>While I would like Canada’s version of “legitimate interests” to more closely parallel the one in the European General Data Protection Regulation, I think it is a completely reasonable addition to Canada’s privacy law. It requires a deliberate analysis and determination of whether it can be used and requires the organization to be transparent with its customers about the practice.</p><div><br /></div><div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-48775349126654430652023-05-08T12:25:00.002-03:002023-05-08T12:25:41.078-03:00British Columbia Privacy Commissioner shuts down facial recognition<p><span style="font-family: inherit;"><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen="" frameborder="0" height="480" src="https://www.youtube.com/embed/YMEFVjREVgU" title="British Columbia Privacy Commissioner shuts down facial recognition" width="720"></iframe>
</span></p><p><span style="font-family: inherit;"><br /></span></p><p><span style="font-family: inherit;"><br /></span></p><p><span style="font-family: inherit;">Recently, the Information and Privacy Commissioner of British Columbia issued a decision that essentially shuts down most use of facial recognition technology in the retail context.</span></p><p><span style="font-family: inherit;">What’s interesting is that the Commissioner undertook this investigation of his own accord. In order to see how prevalent the use of facial recognition was among the province’s retailers, the OIPC surveyed 13 of the province’s largest retailers (including grocery, clothing, electronics, home goods, and hardware stores): 12 responded that they did not use FRT. The remaining retailer, Canadian Tire Corporation, requested that the OIPC contact their 55 independently owned Associate Dealer stores in the province. In the result, 12 stores reported using FRT. Based on these 12 responses, the Commissioner commenced an investigation under s. 36(1)(a) of the Personal Information Protection Act of four of the locations, scattered across the province. </span></p><p><span style="font-family: inherit;">What’s also interesting is that the stores immediately ceased use of the technology, but the Commissioner determined that doing a full investigation was warranted, so that retailers would be aware of the privacy issues with the use of facial recognition in this context. </span></p><p><span style="font-family: inherit;">The investigated stores used two different vendors’ systems, but they essentially operated the same way: the systems took pictures or videos of anyone who entered the stores, as they came within range of the FRT cameras. This included customers, staff, delivery personnel, contractors, and minors who might have entered the store. Using software, the facial coordinates from these images or videos were mapped to create a unique biometric template for each face. 
So everyone was analyzed this way.</span></p><p><span style="font-family: inherit;">The systems then compared the biometrics of new visitors with those stored in a database of previously identified "Persons of Interest," who were allegedly involved in incidents such as theft, vandalism, harassment, or assault. When a new visitor's biometrics matched an existing record in the database, the FRT system sent an automatic alert to store management and security personnel via email or a mobile device application. The alerts contained the newly captured image or video that triggered the match, along with a copy of the previously collected image from the Persons of Interest database and any relevant comments or details about the prior incidents. According to store managers, these alerts were “advisory” until the match was confirmed in person by management or security personnel.</span></p><p><span style="font-family: inherit;">Store management reported that after a positive match was verified, the nature of the prior incident allegedly involving the individual helped determine a course of action. If a prior incident included violence, management or security staff would escort the individual from the store. If the prior incident involved theft, management may have chosen to surveil or remove the person in question.</span></p><p><span style="font-family: inherit;">The legal questions posed by the Commissioner were (1) whether consent was required under PIPA for the collection and use of images for this purpose, (2) whether the stores provided notification and obtained the necessary consent (through signage or otherwise), and – most importantly – (3) whether this collection and use is for an “appropriate purpose” under s. 11 and 14 of PIPA.</span></p><p><span style="font-family: inherit;">The first question was easy to answer: Yes, consent is required in this context. 
PIPA, like PIPEDA, requires organizations to obtain consent, either explicitly or implicitly, before collecting, using, or disclosing personal information unless a specific exception applies. No such exceptions applied in this case. Therefore, the Commissioner concluded it was incumbent on the stores to show that individuals gave consent for the collection of their personal information. </span></p><p><span style="font-family: inherit;">How would you get that consent? Well the stores had signage at the entrances. Clear signage is usually sufficient for the use of surveillance cameras, but the question would be whether these would be sufficient for this use.</span></p><p><span style="font-family: inherit;">Store number 1 had a sign that stated, in part: “these premises are monitored by video surveillance that may include the use of electronic and/or biometric surveillance technologies.”</span></p><p><span style="font-family: inherit;">The Commissioner said this was inadequate. The notice did not state the purposes for the collection of personal information. Also, stating that biometric surveillance “may” be in use did not reflect that the store continuously employed the technology. The Commissioner said the average person cannot reasonably be expected to understand how their information may be handled by “biometric surveillance technologies,” let alone the implications and risks of this new technology. Consent requires that an individual understands what they are agreeing to – and the posted notification failed to adequately alert the public in this case, according to the Commissioner. This store failed to meet notification requirements under PIPA.</span></p><p><span style="font-family: inherit;">The second store had a notice that stated, in part: “facial recognition technology is being used on these premises to protect our customers and our business.” </span></p><p><span style="font-family: inherit;">This one was also not satisfactory to the Commissioner. 
The purpose, as set out, is so broad that the statement would relay no specific meaning to the average person. Furthermore, the notice does not explain what facial recognition technology entails or the nature of the personal information collected. One cannot reasonably assume that members of the public understand what FRT is, nor its privacy implications, according to the Commissioner.</span></p><p><span style="font-family: inherit;">Stores 3 and 4 had better notices, but they still didn’t satisfy the Commissioner. Their notices stated: “video surveillance cameras and FRT (also known as biometrics) are used on these premises for the protection of our customers and staff. These technologies are also used to support asset protection, loss prevention and to prevent persons of interest from conducting further crime. The images are for internal use only, except as required by law or as part of a legal investigation.” </span></p><p><span style="font-family: inherit;">It had more detail, but was not that well written. It does not say what “FRT” is. The Commissioner noted that the abbreviation is not yet well-known or widely understood. Using the full phrase “facial recognition technology” along with a basic explanation of its workings would have provided a more accurate description of the stores’ data-collection activities. Even so, the Commissioner said that North American society is not yet at the point where it is reasonable to assume that the majority of the population understands what personal information FRT collects, or creates, as well as the technology’s privacy implications. All of this would have to be spelled out. </span></p><p><span style="font-family: inherit;">While you may be able to rely on implied consent for the use of plain old fashioned surveillance cameras, the Commissioner concluded that you cannot for facial recognition technology, at least in this context. 
</span></p><p><span style="font-family: inherit;">The Commissioner said facial biometrics are a highly sensitive, unique, and unchangeable form of personal information. Collecting, using, and sharing this information goes beyond what people would reasonably expect when entering a retail store, and using FRT creates a significant and lasting risk of harm. The Commissioner said the distinctiveness and permanence of this biometric data can make it an attractive target for misuse, potentially becoming a tool to compromise an individual's identity. In the wrong hands, the Commissioner wrote, this information can lead to identity theft, financial loss, and other severe consequences. (I am not entirely sure how…)</span></p><p><span style="font-family: inherit;">As a result, the four stores were required to obtain explicit consent from customers before collecting their facial biometrics. However, they did not make any attempts, either verbally or in writing, to obtain such consent.</span></p><p><span style="font-family: inherit;">So the notices were not adequate and the stores didn’t get the right kind of consent. But the last nail in the coffin for this use of biometrics was the Commissioner’s conclusion about whether the use of facial recognition technology for these purposes is reasonable. </span></p><p><span style="font-family: inherit;">Reasonableness is determined by looking at the amount of personal information collected, the sensitivity of the information, the likelihood of being effective and whether less intrusive alternatives had been attempted.</span></p><p><span style="font-family: inherit;">With respect to the amount of personal information collected, it was vast. The Commissioner said a large quantity of personal information was collected from various sources, including customers, staff, contractors, and other visitors. 
The stores reported that their establishments were visited by hundreds of individuals of all ages, including minors, every day, so during a single month the FRT systems captured images of thousands of people who were simply shopping and not engaging in any harmful activities. The sheer volume of information collected suggests that the collection was unreasonable.</span></p><p><span style="font-family: inherit;">You won’t be surprised that the Commissioner concluded that the personal information at issue was super-duper sensitive. </span></p><p><span style="font-family: inherit;">With respect to the likelihood of being effective, the stores didn’t really have in place any system to measure it. The Commissioner concluded it really wasn’t that effective. </span></p><p><span style="font-family: inherit;">The Commissioner wrote that before implementing new technology that collects personal information, organizations should establish a reliable method to measure the technology's effectiveness. This typically involves comparing relevant metrics before and after the technology's implementation. </span></p><p><span style="font-family: inherit;">However, in this case, the stores did not provide any systematic evidence of measuring their FRT system's effectiveness. Instead, they only gave anecdotal evidence of incidents before and after installation. Without a clear way to measure the technology's effectiveness, it is challenging to analyze this factor, particularly when collecting highly sensitive personal information.</span></p><p><span style="font-family: inherit;">The accuracy of FRT technology is also a related issue. Systems such as these have been widely reported to falsely match the facial biometrics of people of colour and women. </span></p><p><span style="font-family: inherit;">The store managers acknowledged that the alerts could be inaccurate and relied on staff to compare database images to a visual observation of the individual. 
This manual check by staff suggests that the FRT system may not be effective. False identification can have harmful consequences when innocent shoppers are followed or confronted based on an inaccurate match.</span></p><p><span style="font-family: inherit;">Besides the system's accuracy, its effectiveness can also be judged against the existing methods used by the stores to identify potential suspects. The store managers stated that their security guards and managers typically knew the "bad actors" and could recognize them without FRT alerts. The persons of interest were often professional thieves who repeatedly returned to the store.</span></p><p><span style="font-family: inherit;">Moreover, there is little evidence that FRT enhanced customer and employee safety. Whether a person of interest was identified by FRT or by the visual recognition of an employee, the stores' next steps were the same. These involved deciding whether to observe the suspected person or interact with them directly, including escorting them from the premises. In either case, store managers rarely reported contacting the police for assistance.</span></p><p><span style="font-family: inherit;">As for whether less intrusive alternatives had been attempted, the less intrusive measures were what they were doing before. The Commissioner concluded that the use of FRT didn’t add much to solving the stores’ problems, but collected a completely disproportionate amount of sensitive personal information. The less intrusive means – without biometrics – largely did the trick. </span></p><p><span style="font-family: inherit;">In the end, the Commissioner made three main recommendations. </span></p><p><span style="font-family: inherit;">The first was that the stores should build and maintain robust privacy management programs that guide internal practices and contracted services – presumably so they wouldn’t implement practices such as these that are offside the legislation. 
</span></p><p><span style="font-family: inherit;">This report also makes two recommendations for the BC government. First, the government should amend the Security Services Act or similar enactments to explicitly regulate the sale or installation of technologies that capture biometric information. </span></p><p><span style="font-family: inherit;">Finally, the BC government should amend PIPA to create additional obligations for organizations that collect, use, or disclose biometric information, including requiring notification to the OIPC. This would be similar to what’s in place in Quebec, where biometric databases need to be disclosed to the province’s privacy commissioner. </span></p><p><span style="font-family: inherit;">I think, for all intents and purposes, this shuts down the use of facial recognition technology in the retail context, where it is being used to identify “bad guys”. </span></p><p><br /></p><p></p><div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-44482257599225024442023-04-16T19:00:00.001-03:002023-04-17T07:21:35.472-03:00Privacy Commissioner of Canada Loses in Federal Court against Facebook<p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen="" frameborder="0" height="480" src="https://www.youtube.com/embed/tepA9ZTvrs8" title="Privacy Commissioner of Canada loses in the federal court against Facebook re Cambridge Analytica" width="720"></iframe>
</p><p><br /></p><p>Just this past week, the Office of the Privacy Commissioner of Canada was on the receiving end of a <a href="https://drive.google.com/file/d/1JPyOpxCY9d-S6bWROGur53ZXK4xSMyFt/view?usp=sharing" target="_blank">Federal Court decision</a> that I would characterize as more than a little embarrassing for the Commissioner.</p><p>In a nutshell, the Commissioner took Facebook to court over the Cambridge Analytica incident and lost, big time.</p><p>You may recall from 2019, when the Privacy Commissioner of Canada and the Information and Privacy Commissioner of British Columbia released, with as much fanfare as possible, the result of their joint investigation into Facebook related to the Cambridge Analytica incident.</p><p>Both of the Commissioners concluded, at that time, that Facebook had violated the federal and British Columbia privacy laws, principally related to transparency and consent.</p><p>Because Facebook was not prepared to accept that finding, the Privacy Commissioner of Canada commenced an application in the Federal Court to have the Court make the same determination and issue a whole range of orders against the social media company.</p><p>The hearing of that application took place a short time ago and a decision was just released from the Federal Court this past week. It concluded that the Privacy Commissioner did not prove that Facebook violated our federal privacy law in connection with the Cambridge Analytica incident and made a few other interesting findings and observations. </p><p>Just a little bit of additional procedural information: under our current privacy law, the Privacy Commissioner of Canada does not have the ability to issue any orders or to levy any penalties. What can happen after the Commissioner has released his report of findings is that the complainant, or the Commissioner with the complainant’s okay, can commence an application in the Federal Court of Canada. This is what is called a de novo proceeding. 
</p><p>The finding from the privacy commissioner below can be considered as part of the record, but it is not a decision being appealed from. Instead, the applicant, in this case, the Privacy Commissioner, has the burden of proving to a legal standard that the respondent has violated the federal privacy legislation.</p><p>This has to be done with actual evidence, which is where the privacy commissioner fell significantly short in the Facebook case.</p><p>It has to be remembered that the events being investigated took place almost 10 years ago, and the Facebook platform is substantially different now compared to what it looked like then. If you were a Facebook user from that time, you probably remember a whole bunch of apps running on the Facebook platform. You probably were annoyed by friends who were playing Farmville and sending you invitations and updates. Well, these don't exist anymore. Facebook is largely no longer a platform on which third-party apps run.</p><p>In a nutshell, at the time, one of the app developers that used the Facebook platform was a researcher associated with a company called Cambridge Analytica. They had an app running on the platform called “this is your digital life”. It operated for some time in violation of Facebook's terms of use for app developers, hoovering up significant amounts of personal information and then selling and/or using that information for, among other things, profiling and advertising targeting. Here’s how the court described it:</p><p></p><blockquote><p>[36] In November 2013, Cambridge professor Dr. Aleksandr Kogan launched an app on the Facebook Platform, the TYDL App. The TYDL App was presented to users as a sort of personality quiz. Prior to launching the TYDL App, Dr. Kogan agreed to Facebook’s Platform Policy and Terms of Service. Through Platform, Dr. Kogan could access the Facebook profile information of every user who installed the TYDL App and agreed to its privacy policy. 
This included access to information about installing users’ Facebook friends. ...</p><p>[38] Media reports in December 2015 revealed that Dr. Kogan (and his firm, Global Science Research Ltd) had sold Facebook user information to Cambridge Analytica and a related entity, SCL Elections Ltd. The reporting claimed that Facebook user data had been used to help SCL’s clients target political messaging to potential voters in the then upcoming US presidential election primaries.</p><p></p></blockquote><p>One thing to note is that in 2008-2009, the OPC investigated Facebook and the Granular Data Permissions model that it was employing on its platform. Facebook said that the OPC sanctioned and expressly approved its GDP process, after testing it, at the conclusion of that investigation. They argued that the Commissioner should not be able to now say that a model it approved is inadequate. The Court didn’t have to go there. </p><p>In this application, the Privacy Commissioner alleged that Facebook failed to get adequate consent from users who used apps on Facebook’s platform, and failed to safeguard personal information that was disclosed to third party app developers. The Commissioner failed on both, but for different reasons. </p><p>In the court process, both the Commissioner and Facebook had the opportunity to put their best evidence and best arguments forward. Facebook was able to talk about their policies, their practices with respect to third party developers, and the sorts of educational material that they provided as part of their privacy program. 
</p><p>Ultimately, the court concluded that the Commissioner had failed to put forward strong evidence to lead to the conclusion that Facebook had not obtained adequate user consent for the collection, use and disclosure of their personal information when using the app in question, or apps more generally.</p><p>It’s interesting to me that the Court notes that the Commissioner did not provide any evidence of what Facebook could have done better, in their view, nor did it offer any expert evidence about what would have been reasonable to do in the circumstances. This is from paragraph 71 of the decision:</p><p></p><blockquote><p>[71] In assessing these competing characterizations, aside from evidence consisting of photographs of the relevant webpages from Facebook’s affiant, the Court finds itself in an evidentiary vacuum. There is no expert evidence as to what Facebook could feasibly do differently, nor is there any subjective evidence from Facebook users about their expectations of privacy or evidence that any user did not appreciate the privacy issues at stake when using Facebook. While such evidence may not be strictly necessary, it would have certainly enabled the Court to better assess the reasonableness of meaningful consent in an area where the standard for reasonableness and user expectations may be especially context dependent and are ever evolving.</p><p></p></blockquote><p>The Court also seems to be saying that the Commissioner was trying to suck and blow at the same time:</p><p></p><blockquote>[67] Overall, the Commissioner characterizes Facebook’s privacy measures as opaque and full of deliberate obfuscations, creating an “illusion of control”, containing reassuring statements of Facebook’s commitments to privacy and pictures of padlocks and studious dinosaurs that communicate a false sense of security to users navigating the relevant policies and educational material. 
On one hand, the Commissioner criticizes Facebook’s resources for being overly complex and full of legalese, rendering those resources as being unreasonable in providing meaningful consent, yet in some instances, the Commissioner criticizes the resources for being overly simplistic and not saying enough. </blockquote><p></p><p>The judge then found that the Commissioner was essentially asking the court to make a whole bunch of negative inferences in the absence of evidence, which they did not appear to try to obtain. Here’s the court at paragraph 72 of the decision: </p><p></p><blockquote><p>[72] Nor has the Commissioner used the broad powers under section 12.1 of PIPEDA to compel evidence from Facebook. Counsel for the Commissioner explained that they did not use the section 12.1 powers because Facebook would not have complied or would have had nothing to offer. That may be; however, ultimately it is the Commissioner’s burden to establish a breach of PIPEDA on the basis of evidence, not speculation and inferences derived from a paucity of material facts. If Facebook were to refuse disclosure contrary to what is required under PIPEDA, it would have been open to the Commissioner to contest that refusal.</p><p></p></blockquote><p>The judge then goes on to say at paragraph 77:</p><p></p><blockquote><p>[77] In the absence of evidence, the Commissioner’s submissions are replete with requests for the Court to draw “inferences”, many of which are unsupported in law or by the record. For instance, the Court was asked to draw an adverse inference from an uncontested claim of privilege over certain documents by Facebook’s affiant. </p><p></p></blockquote><p>I think there are a couple of very important things to note here. The first is that the Privacy Commissioner’s report of findings, which was released with great fanfare and which concluded that Facebook had violated Canada's federal privacy laws, was essentially based on inadequate evidence. 
The court found it sadly lacking – not enough to convince the Court that it was more likely than not – but apparently this evidentiary record was entirely satisfactory for the purposes of the Commissioner’s investigation and report of findings.</p><p>The second thing to note here is that the court application was essentially the privacy commissioner's second kick at the can. More evidence could have been obtained for this hearing had they actually exercised their authorities under the legislation or under the rules of court. They did not, and so they came to court with an inadequate evidentiary record.</p><p>The second main violation that was alleged by the Privacy Commissioner was that Facebook had failed to adequately safeguard user information that was disclosed to third party app developers. Essentially, the Privacy Commissioner's argument is that Facebook continues to have an obligation to safeguard all of the information even after a user has chosen to disclose that information to a third party app developer. Facebook took the view that the safeguarding obligation transferred to the app developer when the user initiated the disclosure to that app developer. </p><p>This is consistent with the scheme of the Act, in my view, because the responsibility to safeguard information and to limit its use falls on the organization that actually controls that information. Once it is given to an app developer for this purpose, it is under the control of that app developer and the obligation to safeguard it would rest with them.</p><p>The Court summarized the Commissioner’s argument on this point in paragraph 85:</p><p></p><blockquote><p>[85] The Commissioner counters that Facebook maintains control over the information disclosed to third-party applications because it holds a contractual right to request information from apps. 
The Commissioner maintains that Facebook’s safeguards were inadequate.</p><p>[86] I agree with Facebook; its safeguarding obligations end once information is disclosed to third-party applications. The Court of Appeal in Englander observed that the safeguarding principle imposed obligations on organizations with respect to their “internal handling” of information once in their “possession” (para 41). </p></blockquote><p></p><p>Very importantly here, though, is the statement from the court that companies can expect good faith and honesty in contractual agreements:</p><p></p><blockquote>[91] In any event, even if the safeguarding obligations do apply to Facebook after it has disclosed information to third-party applications, there is insufficient evidence to conclude whether Facebook’s contractual agreements and enforcement policies constitute adequate safeguards. Commercial parties reasonably expect honesty and good faith in contractual dealings. For the same reasons as those with respect to meaningful consent, the Commissioner has failed to discharge their burden to show that it was inadequate for Facebook to rely on good faith and honest execution of its contractual agreements with third-party app developers.</blockquote><p></p><p>This is the conclusion that the court reached. So, in the result, the court did not conclude that Facebook had violated PIPEDA in any way in association with the Cambridge Analytica incident.</p><p>Another important observation, in my view, is that the Privacy Commissioner of Canada did not actually investigate Cambridge Analytica itself, but focused all of its regulatory attention on Facebook. It is common ground that Cambridge Analytica and its principal violated Facebook's policies and developer agreements in taking user data off the platform and using it for secondary, unauthorized purposes. But they did not investigate Cambridge Analytica. 
They went after Facebook.</p><p>So what are the takeaways from this?</p><p>I think certain folks at the Office of the Privacy Commissioner should take an opportunity to think deeply about their approach to this entire thing. They should not be issuing flashy press releases and lobbing accusations in the way that they did without evidence that could support the allegations in a court of law. </p><p>I also think we need to think carefully about what this says for privacy law reform in Canada. The Commissioner at the time used his finding as an example of why he should be given order-making powers and the powers to impose penalties. His office even issued a handy-dandy table which concluded:</p><p>Because “Facebook disputed the validity of the findings and refused to implement the recommendations,” this should lead to the result that:</p><p></p><blockquote><p>“The Office of the Privacy Commissioner of Canada’s interpretation of the law should be binding on organizations. </p><p>To ensure effective enforcement, the Commissioner should be empowered to make orders and impose fines for non-compliance with the law.”</p></blockquote><p></p><p>Almost certainly, if he’d had those powers, he would have imposed orders and fines on Facebook, based on what the Court concluded was inadequate evidence. The Court even disagreed with the Commissioner’s interpretation of the law. </p><p>If we are going to have fines and orders under PIPEDA’s replacement, which seems inevitable, the OPC should NOT be in a position to impose them. The OPC should be the prosecutor, recommending any such fines or orders to a tribunal that will not show any deference to the Commissioner. </p><p>And finally, this offers some certainty that once information has been disclosed to a third party, it is the third party’s legal obligation to safeguard it. 
The OPC clearly thought that the obligation remained with the company where it originated, but that view was not shared by the court.</p><p>After the OPC filed its application in court, Facebook filed a judicial review application to have the whole thing thrown out. Facebook was not successful on that, mainly because they filed late and were not entitled to an extension. Regardless, there are some very interesting things in that decision, which I’ll discuss in an upcoming episode.</p><p><br /></p><p></p><div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-49998686781342243392022-12-18T16:02:00.001-04:002024-02-13T10:15:04.227-04:00Where to find me ...<p>Given the current dumpster fire at Twitter and the recent ban on outbound links to other social platforms, I thought I'd do a post of where to find me:
<ul><li>Twitter: <a href="https://twitter.com/privacylawyer">https://twitter.com/privacylawyer</a></li>
<li>Mastodon: <a href="https://twit.social/@privacylawyer">https://twit.social/@privacylawyer</a></li>
<li>YouTube: <a href="https://youtube.com/@privacylawyer">https://youtube.com/@privacylawyer</a></li>
<li>BlueSky: <a href="https://bsky.app/profile/privacylawyer.bsky.social" target="_blank">https://bsky.app/profile/privacylawyer.bsky.social</a></li>
<li>Threads: <a href="https://www.threads.net/@davidtsfraser" target="_blank">https://www.threads.net/@davidtsfraser</a></li>
<li>My firm, McInnes Cooper: <a href="https://www.mcinnescooper.com/people/david-fraser/">https://www.mcinnescooper.com/people/david-fraser/</a></li>
<li>My blog, the Canadian Privacy Law Blog: <a href="https://blog.privacylawyer.ca">https://blog.privacylawyer.ca</a></li>
</ul><div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-24856213585255122622022-08-15T13:28:00.001-03:002022-12-28T13:33:03.368-04:00
<p><iframe width="720" height="480" src="https://www.youtube.com/embed/KQJuWrunUVs" title="Taking photos and recording videos in public places for personal purposes" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
<p>Can someone legitimately try to stop you from taking photos or recording video in a public place? There are some laws to know about, but the answer for Canada is that you generally have the right to take photos or record video in a public place, and nobody can lawfully stop you from doing so. </p>
<h1>How it came up</h1>
<P>This past week on Twitter, I saw a couple of discussions about people taking photos in public places, either being called out about it online or being told in person to cut it out. </p>
<P>In the first example, Canadian journalist James MacLeod took it upon himself to get a radar speed gun and document people speeding through a park. He’d take photos of drivers and their speed, and post them on Twitter. One Twitter user said doing so seemed “suspect”. </p>
<P>In the second example, a person in Toronto tweeted that he’d been told by a security guard to not take photos of a shipping container put in a public street, blocking a cycling lane. As I replied, “there is no legal basis upon which a security guard can require an individual private citizen to stop taking photos or video in a public place.”</p>
<P>I’ve previously done a video about recording the police in public (link below), but figured it was time to do a more general video about photography and videography in public. </p>
<P>Here’s the general rule: you can take photos in a public place or record video on public property without any legal consequences. That doesn’t always mean you should, but you generally can. You can also photograph or record any place or thing that is visible from a public place, which would include private property as long as you yourself are not trespassing.</p>
<P>There is nothing in our criminal law that makes it illegal to take photos or video in a public place. Other general laws are going to apply. You can’t be a nuisance, you can’t damage property, and you can’t obstruct the police when they are carrying out their duties. You can’t block traffic to get the perfect shot. Short of that, you can generally stand in a public place and take photos of everything and everyone you see. </p>
<P>In fact, you have a Charter right to take photos or record video. The right to freedom of expression protected in section 2(b) of the Charter also protects your right to collect information. Photography and videography are inherently expressive activities and are thus Charter-protected. Any limitation in law on that right would have to be justified under s. 1 of the Charter and any sort of blanket “no photography in public” law would not be justifiable. </p>
<h1>Exceptions – voyeurism</h1>
<P>That said, there is a crime of voyeurism that has a few nuances and can apply in public or quasi-public places. It was added to the Criminal Code relatively recently. </p>
<P>It involves surreptitiously observing or recording a person where there is a reasonable expectation of privacy. It has to be surreptitious and there has to be a reasonable expectation of privacy.</p>
<P>Paragraph (a) makes it an offence to observe or record in a place in which a person can reasonably be expected to be nude … or to be engaged in explicit sexual activity.</p>
<P>Paragraph (b) makes it an offence where the recording or observing is done for the purpose of observing or recording a person in such a state or engaged in such an activity.</p>
<P>Paragraph (c) covers a broader range of observation or recording, but where it is done for a sexual purpose. </p>
<P>People should be aware that the courts have held you can have a reasonable expectation of privacy in a relatively public place and that the expectation of privacy can vary according to the method of observation. For example, you may not have much of an expectation of privacy with regard to being observed by someone at eye level, but you may have a protected expectation of privacy from being observed or recorded up a person’s dress or from above to look down their top. </p>
<P>One of the leading cases on this is called Jarvis.</p>
<P>The accused was a teacher at a high school. He used a camera concealed inside a pen to make surreptitious video recordings of female students while they were engaged in ordinary school-related activities in common areas of the school. Most of the videos focused on the faces, upper bodies and breasts of female students. The students were not aware that they were being recorded. Of course, they did not consent to the recordings. A school board policy in effect at the relevant time prohibited the type of conduct engaged in by the accused. There were other official surveillance cameras in the school hallways. </p>
<P>The court said: </p>
<blockquote>“Given ordinary expectations regarding video surveillance in places such as schools, the students would have reasonably expected that they would be captured incidentally by security cameras in various locations at the school and that this footage of them could be viewed or reviewed by authorized persons for purposes related to safety and the protection of property. It does not follow from this that they would have reasonably expected that they would also be recorded at close range with a hidden camera, let alone by a teacher for the teacher’s purely private purposes (an issue to which I will return later in these reasons). In part due to the technology used to make them, the videos made by Mr. Jarvis are far more intrusive than casual observation, security camera surveillance or other types of observation or recording that would reasonably be expected by people in most public places, and in particular, by students in a school environment.”</blockquote>
<P>So while the students should have expected to be incidentally observed by the school’s cameras, that did not ultimately affect their expectation of privacy where a teacher with a hidden camera was concerned. He was convicted of voyeurism. </p>
<P>Another key element in the voyeurism offence is that it has to be surreptitious. In Jarvis, the camera was disguised in a pen. There is a case from Ontario called R. v. Lebenfish, 2014 ONCJ 130, in which a person was charged with voyeurism after he was observed taking photos, mainly of women, at a nude beach in Toronto. He was acquitted because he did not make any effort to hide what he was doing. The court also found that the other beach-goers did not have a reasonable expectation of privacy. The court did note that he wasn’t using a long zoom lens or other form of photographic enhancement. </p>
<P>Sneakily taking photos up dresses can be the offence of voyeurism, but standing on a sidewalk obviously taking a photo of someone else would not be. </p>
<P>In Lebenfish, the accused was also charged with mischief. Specifically, it was alleged he committed mischief “by willfully interfering with the lawful enjoyment without legal justification of property,” namely, the beach. </p>
<P>The court found that he did not interfere with the lawful enjoyment of the beach, but also noted that the answer may have been different if there were signs posted saying no photography or if there had been a municipal by-law prohibiting photography at the beach. If photography was prohibited, then part of the enjoyment of the beach would be that it was camera free. </p>
<P>One thing that is worth noting is that the law doesn’t offer any special protection for children. A while ago, the police here in Halifax were looking for someone who was reported to have been taking photos of kids at a public park. That was followed by a lot of people saying that it is plainly illegal to take photos of other people’s children at a park. That’s not the case. It is certainly creepy and concerning, but likely not illegal in and of itself. </p>
<h1>Privacy laws</h1>
<P>What about other kinds of laws? We have privacy laws to think about. The ones I deal with most often regulate what businesses can do. An individual taking photos for personal purposes is not a business. </p>
<P>And just to be clear, they have carve-outs for personal use and artistic use. Here’s what PIPEDA says:</p>
<blockquote>(2) This Part does not apply to
<P>(b) any individual in respect of personal information that the individual collects, uses or discloses for personal or domestic purposes and does not collect, use or disclose for any other purpose; or</p>
<P>(c) any organization in respect of personal information that the organization collects, uses or discloses for journalistic, artistic or literary purposes and does not collect, use or disclose for any other purpose.</p></blockquote>
<P>The other provincial general privacy laws have similar exclusions. </p>
<h1>Privacy torts</h1>
<P>So what about the risk of being sued for damages for invasion of privacy? That’s not likely either. </p>
<P>In most common law provinces, you can sue or be sued for “intrusion upon seclusion”. </p>
<P>It is, in summary “an intentional or reckless intrusion, without lawful justification, into the plaintiff's private affairs or concerns that would be highly offensive to a reasonable person.”</p>
<P>If you poke into someone’s private life in a way that would be highly offensive, harm and damages are presumed. </p>
<P>You can also be sued for public disclosure of private facts, which also has to engage someone’s private life and be highly offensive to a reasonable person. </p>
<P>It is hard to see how taking photographs or video in a public place would engage someone’s private and intimate life, and be highly offensive to a reasonable person. It could be engaged if one were stalking someone, though. </p>
<h1>Statutory torts</h1>
<P>Some provinces have what are called statutory torts of invasion of privacy. </p>
<P>Here is the gist of the British Columbia Privacy Act. </p>
<P><blockquote>1(1) It is a tort, actionable without proof of damage, for a person, wilfully and without a claim of right, to violate the privacy of another.</blockquote></p>
<P>Note the violation has to be without a claim of right or legitimate justification. </p>
<P>It then goes on and says …</p>
<blockquote><p>(2) The nature and degree of privacy to which a person is entitled in a situation or in relation to a matter is that which is reasonable in the circumstances, giving due regard to the lawful interests of others.</p>
<P>(3) In determining whether the act or conduct of a person is a violation of another's privacy, regard must be given to the nature, incidence and occasion of the act or conduct and to any domestic or other relationship between the parties.</p></blockquote>
<P>Note it specifically refers to eavesdropping and surveillance in subsection (4), which reads:</p>
<P><blockquote>(4) Without limiting subsections (1) to (3), privacy may be violated by eavesdropping or surveillance, whether or not accomplished by trespass.</blockquote></p>
<P>Again, it is hard to see how obviously taking photographs or video in a public place would engage this tort, but it could be engaged if one were stalking someone.</p>
<h1>Private property but public places</h1>
<P>Regularly, we go to places where the public is generally invited, but it is private property. This can also include what we often think of as being “public property”, but it is owned by someone else. Think of a park, which is owned by a municipality. People or organizations that own property can put conditions on entry to that property. One of those conditions may be “no photography”. And if you exceed or violate the conditions of your invitation, you could then be trespassing. The property owner would be within their rights to ask you to leave under provincial trespassing statutes. In some provinces, it may be a provincial summary offence. But the owner or occupier of the property would have to put you on notice that photography is prohibited on the premises. </p>
<h1>Requests to delete photos</h1>
<P>Finally, I’m sometimes asked if you can be required to delete photos taken. The answer is a resounding no. No private individual can take your phone and nobody can require you to delete any photos. </p>
<div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-61561424140925071702022-08-08T09:30:00.002-03:002022-08-08T09:30:00.225-03:00Video: OPC Finding: Spam messages sent by COVID testing contractor
<p><iframe width="720" height="480" src="https://www.youtube.com/embed/e75UlPhSX5o" title="OPC Finding: Spam messages sent by COVID testing contractor" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
<P>The Privacy Commissioner of Canada just released a <a href="https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2022/pipeda-2022-002/" target="_blank">report of findings</a> about a company contracted by the Airport of Montreal to do on-arrival COVID testing. The company added the people tested to their mailing list and sent them unsolicited commercial electronic messages. The investigation was done jointly with the Information Commissioner of Quebec. The finding raises more questions than it answers. </P>
<P>The complainant in this case arrived at Montreal’s Trudeau International Airport. To comply with the Public Health Agency of Canada’s rules, the individual had to undergo on-arrival COVID testing. Conveniently, the Airport had contracted with a company called Biron Health Group to do COVID testing directly at the airport. So the complainant went to the Biron site, provided them with his contact information and had the test done; it was negative and they emailed him the results. </P>
<P>A few days after receiving his test results, the complainant received an email from Biron promoting its other services. The complainant unsubscribed using the link in the email, and never received any further unwanted emails from them. The OPC said “he was shocked to receive such an email” and filed a complaint with the OPC. </P>
<P>The information and privacy commissioner of Quebec also investigated, but does not appear to have released a decision on the case. Instead, they <a href="https://www.cai.gouv.qc.ca/biron-groupe-sante-inc-cesse-lenvoi-de-courriels-promotionnels-aux-voyageurs-ayant-passe-un-test-de-depistage-de-la-covid-19/" target="_blank">just referred to the OPC’s finding</a>. </P>
<P>During the course of the investigation, the company said it had “implied consent” under Canada’s Anti-Spam Law to send commercial electronic messages and was justified in doing so. </P>
<P>The OPC said there was no implied consent under PIPEDA, however. Here’s what they said specifically:</P>
<blockquote><p>“The OPC is of the opinion that Biron could not reasonably assume that it had the implicit consent of travellers arriving in Canada. Biron was mandated by the government to conduct COVID-19 testing on travellers and paid by the Montreal Trudeau Airport. Biron was the only company offering this service at this airport. Consequently, travellers arriving in Canada had no choice but to do business with Biron to comply with the rules issued by the Public Health Agency. In this situation, these travellers would not normally expect their personal information to be used for reasons other than the mandatory testing.</p>
<p>Biron collected the travellers’ personal information for the purpose of conducting COVID-19 tests and sending them sensitive information related to their health, notably their test results. Biron was acting as a service provider for the airport. The OPC considers that Biron should have taken these circumstances into account before using the personal information for secondary marketing purposes and for its own purposes.”</p></blockquote>
<P>Because Biron said they’d stop doing this, the OPC closed the file as “settled during the course of the investigation”. Case closed.</P>
<P>So why is this unsatisfying? There are a couple of key questions in the background, of interest to privacy practitioners, that are unaddressed and thus unanswered.</P>
<P>The first question is what law should actually apply to Biron in this case? The Privacy Commissioner refers to PIPEDA, our federal commercial privacy law. But we have a mess of privacy laws in Canada, more than a few of which could have been applicable. </P>
<P>Quebec has a provincial privacy law that applies to all businesses in that province, unless they are “federal works, undertakings or businesses”. Notably, international airports and airlines are “federal works, undertakings or businesses.”</P>
<P>There really is no doubt that if the testing facility had been off the airport property and operating on its own, the federal privacy law could not have applied at all; instead, the Quebec private sector privacy law would have been applicable. That means the federal Commissioner would have had no jurisdiction to investigate and it would have been entirely up to the Quebec Commissioner to do so.</P>
<P>So does that mean that simply being on or operating from airport property makes you a “federal work, undertaking or business”? I don't think that can really be the case.</P>
<P>Was it because the service they were providing was connected to international travel, placing them within federal jurisdiction? That seems dubious to me.</P>
<P>Were they within federal jurisdiction because they had been engaged by the airport authority to provide this service? The airport authority is certainly a “federal work, undertaking or business”, but does that mean all of its contractors become “federal works, undertakings or businesses”? Again, I don't think that can really be the case. Would a taxi company given a concession to serve the airport automatically come under federal jurisdiction? </P>
<P>They were performing a function that was required by the Public Health Agency of Canada, but PHAC is subject to the federal Privacy Act, which never came up in the commissioner's report of findings.</P>
<P>This would be trickier in a province like Alberta, where there is a provincial general privacy law that excludes PIPEDA and a health privacy law that does not. (Quebec doesn’t have a health-specific privacy law.)</P>
<P>Now, it may well be that both the federal and the Quebec Commissioners thought they didn't even have to consider jurisdiction because they got the result they were looking for during the course of the investigation: the company said it would change its practices, and what might have been problematic under either the Quebec or the federal law has ceased. This seems likely to me, as in my experience the federal Privacy Commissioner's office will bend over backwards to avoid making any statements related to its jurisdiction that could come back to haunt it later.</P>
<P>This is not just a privacy nerd question, because other things turn on whether a company is a “federal work, undertaking or business”. If Biron is in that category, then provincial labour and employment laws don’t apply to that workplace. Instead, the Canada Labour Code applies. Other federal laws would also suddenly apply to them, not just our privacy law. If I were this company, I’d be left scratching my head.</P>
<P>The second element of this that is problematic is the interaction between our privacy laws and Canada's anti-spam law, also known as CASL. You will recall that the company said it was justified in sending commercial electronic messages because it had an “existing business relationship” with the people who underwent testing. The Privacy Commissioner really did not address that, but instead focused on the Personal Information Protection and Electronic Documents Act, which requires consent for all collection, use and disclosure of personal information. That consent can be implied, particularly where it would be reasonable for the individual to expect that their information will be used for a particular purpose in light of the overall transaction. The Commissioner found that individuals would not expect to have their personal information used for the secondary purpose and therefore there was no implied consent under PIPEDA.</P>
<P>But that is contrary to the express scheme of Canada's anti-spam law. Under CASL, an organization can only send a commercial electronic message to a recipient where it has consent to do so. That consent must be either express or implied. Implied consent under CASL is very different from implied consent under PIPEDA. CASL doesn't care about what the consumer's expectation might be. Consent can be implied where there is an existing business relationship. One of the possible existing business relationships is the purchase of goods or services from the organization in the previous two years. Presumably, buying a COVID test from a vendor would meet that threshold and there would be implied consent for sending commercial electronic messages. I do agree with the federal Privacy Commissioner that sending marketing messages to someone who was only tested because the Public Health Agency of Canada required it would really be contrary to the individual's expectation. </P>
<P>But this really does highlight some of the absurd dissonance between our anti-spam law and our privacy law. Both use the term “implied consent”, but it means radically different things in each. From this finding, it appears that the federal Commissioner is of the view that implied consent under CASL does not lead to deemed implied consent under PIPEDA. CASL expressly permits the messages, but PIPEDA does not.</P>
<P>When it comes to consent for sending commercial electronic messages, one would think that the piece of legislation that was expressly written and passed by Parliament for that purpose would be the final say, but the OPC certainly does not seem to be of that view.</P>
<P>The Privacy Commissioner carried out this investigation along with the Quebec commissioner, but there is no mention of whether the CRTC, which is the regulator under CASL, was involved.</P>
<P>At the end of the day, I think an existing business relationship was created between the complainant and the company, so that there would have been implied consent to send commercial electronic messages, regardless of whether the consumer would have expected it. The Commissioner did highlight that the individual had to be tested under the rules of the Public Health Agency of Canada, leaving room to argue that had the individual gone to the company for a test for other purposes, that might have been a more direct commercial relationship between the parties.</P>
<P>As my friend and tech law colleague Jade Buchanan pointed out on Twitter, “CASL is completely unnecessary when PIPEDA will apply to the use of personal information (name email, etc.) to send commercial electronic messages.” Personally, I think that one of the reasons why we have CASL is because PIPEDA was seldom enforced by the OPC against spammers when clear jurisdiction to do so existed for more than a decade before CASL was created. </P>
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">This finding confirms two things:<br><br>1. CASL compliance doesn't guarantee PIPEDA compliance. That is, of course, ridiculous. Implied consent to send a CEM under CASL should be implied consent for the associated use of personal information under PIPEDA. <a href="https://t.co/o404EHUsfV">https://t.co/o404EHUsfV</a></p>— Jade Buchanan (@Jade_Buchanan) <a href="https://twitter.com/Jade_Buchanan/status/1555233237844107264?ref_src=twsrc%5Etfw">August 4, 2022</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<P>And there’s nothing in the pending Consumer Privacy Protection Act that would address this dissonance between our privacy and spam law. </P>
<P>So that is the finding, and we're left scratching our heads a bit or at least have unanswered questions about important matters of jurisdiction and the intersection between our privacy laws and our spam laws.</P>
privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-52745952257732164162022-06-27T10:23:00.004-03:002022-07-29T11:20:37.666-03:00Video: Preparing for Canada's new Consumer Privacy Protection Act<p><iframe width="720" height="480" src="https://www.youtube.com/embed/APyu_hXIKYk" title="Planning for Canada's new Consumer Privacy Protection Act" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<P>The government of Canada tabled the Digital Charter Implementation Act, 2022 in the week before parliament rose for their summer break. While this is in limbo, what, if anything, should Canadian businesses be doing to prepare for the Consumer Privacy Protection Act?</P>
<P>In the week before the summer break, the Industry Minister tabled in Parliament the Digital Charter Implementation Act, which will overhaul Canada’s federal private sector privacy law. It has been long anticipated and, for many, long overdue. With parliamentarians off for the summer, what can we expect and what should businesses be doing to get ready for it?</P>
<P>I expect that when the house resumes, the bill will be referred to either the Standing Committee on Industry, which is where PIPEDA went more than 20 years ago, or to the Standing Committee on Access to Information, Privacy and Ethics. </P>
<P>I have to say that the current government is very unpredictable. When Bill C-11, the first Digital Charter Implementation Act, was tabled in November 2020, the bill just sat there with no referral to committee and it seemed to not be a priority at all. If they are serious about privacy reform, they should get this thing moving when they are back in session.</P>
<P>When it gets to committee, the usual cast of characters will appear to provide comments. First up will be the Minister of Industry and his staff. Then will come the Privacy Commissioner of Canada, who will only have had a few months in office at that point. I would not be surprised to see provincial privacy commissioners have their say, and maybe even data protection authorities from other countries. Then industry and advocacy groups will weigh in. </P>
<P>The then-Commissioner was very critical of the C-11 version of the bill, and it appears that most of his suggestions have gone unheeded. I expect that since then, there has been a lot of consultation and lobbying going on behind the scenes that resulted in the few changes between C-11 and C-27. It will be interesting to see how responsive the committee and the government are to making changes to the bill.</P>
<P>I would not be surprised to see this bill passed, largely in its current form, before the end of the year. But even if it speeds through the House of Commons and the Senate, I do not expect that we will see this law in effect for some time. In order for the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act to be fully in force, the government will have a lot of work to do.</P>
<P>The biggest effort will be standing up the new tribunal under the Personal Information and Data Protection Tribunal Act. Doing so will not be a trivial matter. At least three members have to be recruited, and at least three of those have to have expertise in privacy and information law. They’ll need offices, staff, a registry, IT infrastructure, then they’ll need to make their rules of procedure. I can’t see that taking any less than a year, even if the government is currently informally recruiting for those roles. </P>
<P>An example I’d look at is the College of Patent Agents and Trademark Agents, which was established pursuant to a bill passed in December 2018 and came into force on June 28, 2021. Essentially, it took two and a half years between the passing of the bill and when the College was open for business. The college was probably more complicated to set up than the tribunal, but it provides some insight I think. </P>
<P>Personally, I don’t think the CPPA can be phased in without the tribunal operating as a going concern. There are transitional provisions related to complaints that are being dealt with by the Commissioner prior to the coming into force of the CPPA, but otherwise the existence of the tribunal is essential to the operation of the CPPA and the Commissioner’s mandate. </P>
<P>So if I had to look into my crystal ball, I don’t think we’ll see this fully in effect for at least a year and a half. </P>
<P>So should companies be doing anything now? I think so. When the CPPA and the Tribunal Act come into effect, they will be fully in effect from day one. In addition to making your politicians aware of any concerns you have, companies should be looking very closely at their current privacy management program – if any – to determine if it will be up to snuff.</P>
<P>Section 9 of the Act says that “every organization must implement and maintain a privacy management program that includes the policies, practices and procedures the organization has put in place to fulfill its obligations under this Act, including policies, practices and procedures respecting</P>
<blockquote>(a) the protection of personal information;
(b) how requests for information and complaints are received and dealt with;
(c) the training and information provided to the organization’s staff respecting its policies, practices and procedures; and
(d) the development of materials to explain the organization’s policies and procedures.”</blockquote>
<P>It then says “In developing its privacy management program, the organization must take into account the volume and sensitivity of the personal information under its control.”</P>
<P>This is, of course, very similar to the first principle of the CSA Model Code that’s in PIPEDA. But section 10 of the CPPA says the Commissioner can ask for it and all of its supporting documentation at any time. </P>
<P>I can imagine the OPC sending out requests for all of this documentation to a huge range of businesses shortly after the Act comes into force. </P>
<P>So what does a privacy management program include? It of course includes your publicly-facing privacy statement described in section 62. What has to be in this document will change a lot compared to PIPEDA. It has to explain in plain language what information is under the organization’s control and give a general account of how the organization uses that personal information. </P>
<P>If the organization uses the “legitimate interest” consent exception, the privacy statement has to include a description of that. If the organization uses any automated decision system to make predictions, recommendations or decisions about individuals that could have a “significant impact on them”, that has to be described. It also has to say whether or not the organization carries out any international or interprovincial transfer or disclosure of personal information that may have reasonably foreseeable privacy implications. You also have to state the retention periods applicable to sensitive personal information, then explain the process for questions, complaints, access requests and requests for deletion. Most privacy statements don’t currently include all this information. </P>
<P>You need to assess what personal information you have, where it is, who has it, who has access to it, what jurisdiction it is in or exposed to, how it is secured, when you collected it, what the purposes for that collection were, whether there are any new purposes, and whether those purposes have expired.</P>
<P>A good starting point for your privacy management program is to document all the personal information under the organization’s control and the purposes for which it is to be used. Section 12(3) of the CPPA requires that this be documented. You will also need to ensure that all of these purposes are appropriate using the criteria at section 12(2). </P>
<P>You’ll also want to review whether any of the consent exceptions related to business activities under 18(1) or legitimate interests in section 18(3) could be applicable, and document them. </P>
<P>Under s. 18(4), this documentation will have to be provided to the Commissioner on request. </P>
<P>You will also need to document the retention schedule for all of your personal information holdings, and make sure they are being followed. And remember, all information related to minors is deemed to be sensitive and the retention schedule for sensitive information has to be included in your privacy statement. </P>
<P>Next, you’ll want to inventory and document all of your service providers who are collecting, using or disclosing personal information on your behalf. You’ll need to review all of the contracts with those service providers to make sure each service provider provides protection equivalent to the controlling organization’s own obligations. It should be noted that “service provider”, as defined in the Act, expressly includes affiliated companies. So you’ll need to make sure that intercompany agreements are in place to address any personal information that may be transferred to affiliates. </P>
<P>You’ll want to check your processes for receiving questions, complaints and access requests from individuals. You may need to tweak your systems or processes to make sure that you can securely delete or anonymise data where required. </P>
<P>And last, but certainly not least, you’ll want to look very closely at your data breach response plans. Your plan needs to ensure that all suspected data breaches are identified, properly escalated and reviewed. Any breach itself of course has to be stopped, mitigated and investigated. The details will need to be recorded, and you’ll also want to think about the process for getting legal advice at that stage so that information you may want to keep privileged will be protected and you can understand your reporting and notification obligations. </P>
<P>At the end of the day, the CPPA is not a radical departure from the existing framework of PIPEDA. It requires greater diligence and what we in the privacy industrial complex call “privacy maturity”. Even if it didn’t, the significant penalties and the cost of dealing with investigations and inquiries by the Commissioner and possible hearings before the tribunal should be enough to convince organizations to up their privacy games. </P>
privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-65187021353435772802022-06-20T08:55:00.005-03:002022-07-29T09:00:58.202-03:00Video: An overview of the Digital Charter Implementation Act, 2022<p><iframe width="720" height="480" src="https://www.youtube.com/embed/yjIaI1KMMdY" title="What's in Canada's new federal privacy law? An overview of the Digital Charter Implementation Act" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<p>Finally, the government of Canada has tabled its long-awaited privacy bill, intended to completely overhaul Canada’s private sector privacy law and rocket the country to the front of the pack for protecting privacy. Not quite, but I’ll give you an overview of what it says.</P>
<h1>Highlights</h1>
<P>On June 16, 2022, Industry Minister François-Philippe Champagne finally tabled in the House of Commons <a href="https://www.parl.ca/legisinfo/en/bill/44-1/c-27" target="_blank">Bill C-27, called the “Digital Charter Implementation Act, 2022”</a>. This is the long-awaited privacy bill that is slated to replace the Personal Information Protection and Electronic Documents Act, which has regulated the collection, use and disclosure of personal information in the course of commercial activity in Canada since 2001. </P>
<P>PIPEDA, contrary to what Minister Champagne said at the press conference later that day, has been updated a number of times, but there really has been a broad consensus that it was in need of a more general overhaul.</P>
<P>The bill is very similar to Bill C-11, which was tabled in November 2020 as the Digital Charter Implementation Act, 2020, and which languished in Parliament until dying when the federal government called the last election. </P>
<P>The bill creates three new laws. The first is the Consumer Privacy Protection Act, which is the main privacy law. The second is the Personal Information and Data Protection Tribunal Act and the third is the Artificial Intelligence and Data Act, which I’ll have to leave to another episode.</P>
<P>I don’t plan to do a deep dive into the bill in this video, as I want to spend more time poring over its detailed provisions. We can’t just do a line-by-line comparison with PIPEDA, as the Bill has a completely different structure. You may recall that PIPEDA included a schedule taken from the Canadian Standards Association Model Code for the Protection of Personal Information. The statute largely said “follow that”, with a number of provisions in the body of the Act that modify those standards or set out how the law is overseen.</P>
<P>The most significant difference is what many privacy advocates have been calling for: the Privacy Commissioner is no longer an ombudsman. The law includes order-making powers and punitive penalties. The Bill also creates a new tribunal, the Personal Information and Data Protection Tribunal, which takes over the Federal Court’s current role under PIPEDA but with greater powers. </P>
<P>Other than order-making powers, I don’t see much of a difference between what’s required under the new CPPA and what diligent, privacy-minded organizations have been doing for years.</P>
<P>This is a high-level overview of what’s in Bill C-27, and I’ll certainly do deeper dives into its provisions in later videos.</P>
<H1>Does the law apply any differently?</h1>
<P>PIPEDA applied to the collection, use and disclosure of personal information in the course of commercial activity and to federally-regulated workplaces. That hasn’t changed, but a new section 6(2) says that the Act specifically applies to personal information that is collected, used or disclosed interprovincially or internationally. The Privacy Commissioner had in the past asserted that this was implied, but it was never written in the Act. Now it will be. Two things about that are problematic. The first is that it’s not expressly limited to commercial activity, so there’s an argument to be made that it would apply to non-commercial or employee personal information that crosses borders. The second is that a company with operations in British Columbia and Alberta, when it moves data from one province to another, not only has to comply with the substantially similar privacy laws of each province, but now also has to comply with the Consumer Privacy Protection Act. That seems very redundant.</P>
<P>It includes the same carve-outs for government institutions under the Privacy Act, personal or domestic use of personal information, journalistic, artistic and literary uses of personal information and business contact information. </P>
<P>We really could have benefitted from a clear extension of the Act to personal information that is imported from Europe so we can have confidence that the adequacy finding from the EU, present and future, really applies across the board.</P>
<P>It does have an interesting approach to anonymous and de-identified data, officially creating these two categories. It defines “anonymize” as: “to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means.” So there effectively is no reasonable prospect of re-identification. To “de-identify” data means “to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains.” You’re essentially using data with the identifiers removed. </P>
<P>The legislation does not regulate anonymous data, because there is no reasonable prospect of re-identification. It does regulate de-identified data and generally prohibits attempts to re-identify it. The law also says that in some cases, de-identified data can be used or even has to be used in place of fully identifiable personal information. </p>
<H1>What happened to the CSA model code?</H1>
<P>When you look at the CPPA, you’ll immediately see that it is very different. It’s similar in structure to the Personal Information Protection Acts of Alberta and British Columbia, in that the principles of the CSA Model Code are not in a schedule but are in the body of the Act. And the language of these principles has necessarily been modified to be more statutory rather than the sort of language you see in an industry standards document.</P>
<h1>Any changes to the 10 CSA Principles?</h1>
<P>The ten principles themselves largely haven’t been changed, and this should not be a surprise. Though written in the ’90s, they were based on the OECD guidelines, and we see versions of all ten principles in all modern privacy laws.</P>
<P>What has changed is the additional rigor that organizations have to implement, or more detail that’s been provided about how they have to comply with the law. </P>
<P>For example, principle 1 of the CSA Model Code required that an organization “implement policies and practices to give effect to the CSA Model Code principles”. The CPPA explicitly requires that an organization have a privacy management program:</P>
<blockquote>Privacy management program
<p>9 (1) Every organization must implement and maintain a privacy management program that includes the policies, practices and procedures the organization has put in place to fulfill its obligations under this Act, including policies, practices and procedures respecting</P>
<P>(a) the protection of personal information;</P>
<P>(b) how requests for information and complaints are received and dealt with;</P>
<P>(c) the training and information provided to the organization’s staff respecting its policies, practices and procedures; and</P>
<P>(d) the development of materials to explain the organization’s policies and procedures.</P>
<P>Volume and sensitivity</P>
<P>(2) In developing its privacy management program, the organization must take into account the volume and sensitivity of the personal information under its control.</blockquote>
<P>This privacy management program has to be provided to the Privacy Commissioner on request. </P>
<P>With respect to consent, organizations expressly have to record and document the purposes for which any personal information is collected, used or disclosed. This was implied in the CSA Model Code, but is now expressly spelled out in the Act. </P>
<P>Section 15 lays out in detail what is required for consent to be valid. Essentially, it requires not only identifying the purposes but also communicating in plain language how the information will be collected, the reasonably foreseeable consequences of the collection and use, the types of information involved, and to whom the information may be disclosed. </P>
<P>I’ll have to save digging into the weeds for another episode.</P>
<h1>Collection and use without consent</H1>
<P>One change compared to PIPEDA that will delight some and enrage others is the circumstances under which an organization can collect and use personal information without consent. Section 18 allows collection and use without consent for certain business activities, where it would reasonably be expected to provide the service, for security purposes, for safety or other prescribed activities. Notably, this exception cannot be used where the personal information is to be collected or used to influence the individual’s behaviour or decisions. </P>
<P>There is also a “legitimate interest” exception, which requires an organization to document any possible adverse effects on the individual, mitigate them and finally weigh whether the legitimate interest outweighs any adverse effects. It’s unclear how “adverse effects” would be measured.</P>
<P>Like PIPEDA, an individual can withdraw consent subject to similar limitations to those in PIPEDA. But what’s changed is that an individual can require that their information be disposed of. Notably, disposal includes both deletion and rendering the information anonymous.</P>
<h1>Law enforcement access </h1>
<P>On a first review, it doesn’t look like there are many other circumstances where an organization can collect, use or disclose personal information compared to section 7 of PIPEDA. </P>
<P>In my view, it is very interesting that the exceptions that can apply when the government or the cops come looking for personal information have not changed from section 7(3) of PIPEDA. For example, the provision that the Supreme Court of Canada in R v Spencer said was meaningless is essentially reproduced in full. </P>
<blockquote>44 An organization may disclose an individual’s personal information without their knowledge or consent to a government institution or part of a government institution that has made a request for the information, identified its lawful authority to obtain the information and indicated that the disclosure is requested for the purpose of enforcing federal or provincial law or law of a foreign jurisdiction, carrying out an investigation relating to the enforcement of any such law or gathering intelligence for the purpose of enforcing any such law.</blockquote>
<P>The Supreme Court essentially asked “what the hell does lawful authority mean?” The government has made no effort to define it in Bill C-27, but that’s just as well, since companies should always say “come back with a warrant”.</P>
<h1>Investigations</h1>
<P>The big changes are with respect to the role of the Privacy Commissioner. The Commissioner is no longer an ombudsman with a focus on nudging companies to compliance and solving problems for individuals. It has veered strongly towards enforcement. </P>
<P>As with PIPEDA, enforcement starts with a complaint by an individual, or the Commissioner can initiate an investigation on his own initiative. There are more circumstances under the CPPA where the Commissioner can decline to investigate. After the investigation, the matter can be referred to an inquiry. </P>
<P>Inquiries seem to have way more procedural protections for fairness and due process than under the existing ad hoc system. For example, each party is guaranteed a right to be heard and to be represented by counsel. To my knowledge, they’ve always done this, but now it will be baked into the law. Also, the Commissioner has to develop rules of procedure and evidence, which must be followed and made public. </P>
<P>At the end of the inquiry, the Commissioner can issue orders requiring an organization to take measures to comply with the Act or to stop doing something that contravenes the Act. The Commissioner can continue to name and shame violators. Notably, the Commissioner cannot levy any penalties. </P>
<P>The Commissioner can recommend that penalties be imposed by the new Privacy and Data Protection Tribunal.</P>
<h1>The Tribunal</h1>
<P>The legislation creates a new specialized tribunal to hear cases under the CPPA, and its jurisdiction will likely grow to include more matters. The “online harms” consultation that took place in the last year anticipated that certain questions would be determined by this tribunal as well.</P>
<P>Compared to C-11, the new bill requires that at least three of the tribunal members have expertise in privacy. </P>
<P>Its role is to determine whether any penalties recommended by the Privacy Commissioner are appropriate. It also hears appeals of the Commissioner’s findings, appeals of interim or final orders of the Commissioner and a decision by the Commissioner not to recommend that any penalties be levied. </P>
<P>Currently, under PIPEDA, complainants and the Commissioner can seek a hearing in the federal court after the commissioner has issued his finding. That hearing is “de novo”, so that the court gets to make its own findings of fact and determinations of law, based on the submissions of the parties. The tribunal, in contrast, has a standard of review that is “correctness” for questions of law and “palpable and overriding error” for questions of fact or questions of mixed law and fact. These decisions are subject to limited judicial review before the Federal Court. </P>
<P>So what about these penalties? They are potentially huge and I have a feeling that the big numbers were pulled out of the air in order to support political talking points that they are the most punitive in the G7. The maximum administrative monetary penalty that the tribunal can impose in one case is the higher of $10,000,000 and 3% of the organization’s gross global revenue in its financial year before the one in which the penalty is imposed.</P>
<P>The Act also provides for quasi-criminal prosecutions, which can get even higher. </P>
<P>The Crown prosecutor can decide whether to proceed as an indictable offence with a fine not exceeding the higher of $25,000,000 and 5% of the organization’s gross global revenue or a summary offence with a fine not exceeding the higher of $20,000,000 and 4% of the organization’s gross global revenue. If it’s a prosecution, then the usual rules of criminal procedure and fairness apply, like the presumption of innocence and proof beyond a reasonable doubt. </P>
<div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-40971660974999625972022-05-29T16:27:00.001-03:002022-05-29T16:27:15.683-03:00The problem with Bill S-7: Device searches at the border
<p><iframe width="720" height="640" src="https://www.youtube.com/embed/CoR-nfCTwUQ" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<p>The government wants border agents to be able to search your smartphones and laptops without any suspicion that you’ve done anything wrong. I think that’s a problem. There are a lot of problematic bills currently pending before parliament but one in particular is not getting enough attention. It’s Bill S-7, called An Act to amend the Customs Act and the Preclearance Act, 2016. Today I’m going to talk about the bill, digital device searches and what I think about it all.</p>
<p>I don’t know about you, but my smartphone and my laptop contain a vast amount of personal information about me. My phone is a portal to every photo of my kids, messages to my wife, my banking and other information. It contains client information. And Canada Border Services Agency wants to be able to search it without any suspicion that I’d committed a crime or violated any law. </p>
<p><a href="https://www.parl.ca/LegisInfo/en/bill/41-2/s-7">Bill S-7</a>, which was introduced in the Senate on March 31, 2022, is intended to give the CBSA the power to go browsing through your smartphone and mine on what amounts to a whim. It also extends the same powers to US Homeland Security agents who carry out pre-departure pre-clearance at Canadian airports. </p>
<p>If you’ve ever watched the TV show “<a href="https://www.youtube.com/results?search_query=border+security+canada%27s+front+line">Border Security Canada</a>”, you would have seen how routine these sorts of searches are. Many of the searches do produce evidence of illegal activity, like smuggling, immigration violations and even importation of child sexual abuse materials. The question is not whether these searches should ever be permissible, but under what circumstances. The government wants it to be with a very low threshold, while I’m confident that the Charter requires more than that. </p>
<p>We all know there’s a reduced expectation of privacy at the border, where you can be pulled over to secondary screening and have your stuff searched. The Customs Act specifically gives CBSA the power to search goods. But a big problem has arisen because the CBSA thinks the ones and zeros in your phone are goods they can search. </p>
<p>Smartphones were unheard of when the search powers of the Customs Act were last drafted, and the CBSA thinks those powers give it carte blanche to search your devices. In the meantime, the courts have rightly said that’s going too far. So the government is looking to amend the Customs Act to authorize device searches if the CBSA officer has a “reasonable general concern” about a contravention of the law. </p>
<p>One big issue is what the hell does “reasonable general concern” mean? In law, we’re used to language like “reasonable grounds to believe a crime has been committed” or even “reasonable grounds to suspect”, but reasonable general concern is not a standard for any sort of search in Canadian law. Your guess is as good as mine, but it seems pretty close to whether the officer's “spidey sense is tingling”. </p>
<p>S-7 is trying to fix a problem and I think the way they’re doing it will ultimately be found to be unconstitutional. To see that, we have to look at the competing interests at play in this context and look at what the courts have recently said about device searches at the border. </p>
<p>It is clear that you have a reduced expectation of privacy at the border, but it is not completely eliminated. And the Charter is not suspended at the border. For example, border officers can’t detain and strip search you just because they want to. These searches legally cannot be performed unless an officer has reasonable grounds to suspect some legal contravention, notably the concealment of goods. And they can’t strip search you unless there is a reason to do so, like looking for contraband smuggled on your person. </p>
<p>Meanwhile, there is a growing body of case law that says individuals have a very high expectation of privacy in our digital devices. For example, in a case called Fearon from 2014, the Supreme Court modified the common law rule related to search incident to arrest for smartphones, specifically due to the immense privacy implications in searching such devices. Upon arrest, they can routinely search you, your clothes and your belongings, but they can only search your smartphone if certain criteria are met. </p>
<p>The Supreme Court has clearly established that the greater the intrusion on privacy, the greater the constitutional protections and a greater justification for the search is required. And while there may be a diminished expectation of privacy at the border, this expectation is not completely extinguished. </p>
<p>At the same time, there has been a developing body of case law saying that suspicionless searches of personal electronic devices at the border violate the Charter. </p>
<p>The leading Supreme Court of Canada case on privacy at the border is from 1988 called <a href="https://canlii.ca/t/1ftcb">Simmons</a>. In that case, the Court recognized that the degree of personal privacy reasonably expected by individuals at the border is lower than in most other situations. Three distinct types of border searches, with an increasing degree of privacy expectation, were identified: (1) routine questioning which every traveller undergoes at a port of entry, sometimes accompanied by a search of baggage and perhaps a pat or frisk of outer clothing; (2) a strip or skin search conducted in a private room after a secondary examination; and (3) a body cavity search. The first category was viewed as the least intrusive type of routine search, not raising any constitutional issues or engaging the rights protected by the Charter. Essentially, this category can be done without any suspicion of wrongdoing. </p>
<p>Since then, customs agents have treated the search of a phone the same as the search of your luggage, which they conclude they can do without any suspicion of wrongdoing.</p>
<p>The Alberta Court of Appeal in 2020, in a case called <a href="https://canlii.ca/t/jb956">Canfield</a>, said that customs’ treatment of personal electronic devices was wrong, and it does not fit into that first category. The court noted: </p>
<blockquote>“There have been significant developments, both in the technology of personal electronic devices and in the law relating to searches of such devices, since Simmons was decided in 1988. A series of cases from the Supreme Court of Canada over the past decade have recognized that individuals have a reasonable expectation of privacy in the contents of their personal electronic devices, at least in the domestic context. While reasonable expectations of privacy may be lower at the border, the evolving matrix of legislative and social facts and developments in the law regarding privacy in personal electronic devices have not yet been thoroughly considered in the border context.” </blockquote>
<p>The court then said:</p>
<blockquote>“We have also concluded that s 99(1)(a) of the Customs Act is unconstitutional to the extent that it imposes no limits on the searches of such devices at the border, and is not saved by s 1 of the Charter. We accordingly declare that the definition of “goods” in s 2 of the Customs Act is of no force or effect insofar as the definition includes the contents of personal electronic devices for the purpose of s 99(1)(a).”</blockquote>
<p>The Court in Canfield essentially said there has to be a minimal threshold in order to justify a search of a digital device, but they would leave it to parliament to determine what that threshold is.</p>
<p>But the next year, the same Alberta Court of Appeal considered an appeal in a case called <a href="https://canlii.ca/t/jg55r">Al Askari</a>. In that case, the question was related to a search of a personal electronic device justified under immigration legislation. The Court found that like in Canfield, there has to be a threshold and it can’t be suspicionless. </p>
<p>The court commented favourably on the <a href="https://digitalcommons.schulichlaw.dal.ca/cgi/viewcontent.cgi?article=1155&context=cjlt">very reasoned approach</a> put forward by my friend and Schulich School of Law colleague Professor Robert Currie. </p>
<blockquote><p>“Prof Currie suggests that the critical issue is measuring the reasonably reduced expectation of privacy at the border and the extent of permissible state intrusion into it. In his view, this is best achieved through the established test in R v Collins, [1987] 1 SCR 265, 308. Was the search authorized by law? Is the law itself reasonable? Is the search carried out in a reasonable manner?</p>
<p>When assessing whether the law itself is reasonable, Prof Currie proposes a standard of reasonable suspicion because it is tailor-made to the border context. It must amount to more than a generalized suspicion and be based on objectively reasonable facts within the totality of the circumstances: 311. On the reasonableness of the search, he advocates for an inquiry into whether the search was limited in scope and duration.”</p></blockquote>
<p>The Court in both Canfield and Al Askari noted that not all searches are the same, and there are degrees of intrusion into personal electronic devices. Asking to look at a receipt for imported goods on a phone is very different from just perusing the full device looking for anything at all.</p>
<p>So fast forward to March 2022. The Alberta Court of Appeal said it’s up to Parliament to set the threshold and for the courts to determine whether it is compliant with the Charter. So Parliament is proposing a threshold of “reasonable general concern” to search documents on a personal digital device. This is setting things up for years of further litigation.</p>
<p>The creation of a “reasonable general concern” standard is not only novel and undefined in the bill; it is also inconsistent with other legislation governing border searches. It also does not impose any obligation that the type of search carried out must be appropriate to what is “of general concern”, or set any limits on what can be searched on the device once the “reasonable general concern” (whatever that means) threshold is met.</p>
<p>If you look at the case of Fearon, which addressed device searches incident to arrest, the court imposed a bunch of conditions and limits in order to take account of the nature of device searches. Importantly, the extent of the permitted search has to be appropriate to what they legitimately have an interest in. The court said:</p>
<blockquote>“In practice, this will mean that, generally, even when a cell phone search is permitted because it is truly incidental to the arrest, only recently sent or drafted emails, texts, photos and the call log may be examined as in most cases only those sorts of items will have the necessary link to the purposes for which prompt examination of the device is permitted. But these are not rules, and other searches may in some circumstances be justified. The test is whether the nature and extent of the search are tailored to the purpose for which the search may lawfully be conducted. To paraphrase Caslake, the police must be able to explain, within the permitted purposes, what they searched and why”</blockquote>
<p>In the border context, if officers are checking whether someone arriving on a tourist visa actually has a job waiting for them, they shouldn’t go looking for evidence of that in the traveller’s camera roll. They should scan the subject lines of emails, not prowl through all the mail in the inbox.</p>
<p>Fearon also requires police to carefully document their searches, the rationale, what they looked at and why. There is no such requirement in Bill S-7.</p>
<p>Given years of growing jurisprudence confirming that personal electronic devices contain inherently private information, and the courts’ tendency to require meaningful limits on searches of them, the creation of this lower threshold is unreasonable, inconsistent with other search standards, and can be expected to run afoul of the Charter.</p>
<p>I think after Canfield and Al Askari, government lawyers and policy makers huddled and tried to invent a standard that could plausibly be called a threshold but was miles below reasonable suspicion. And this is what they came up with. You’ll note that they ignored all the really smart and sensible things that Professor Currie proposed.</p>
<p>What is also very notable is that the government ignored the recommendations made by the House of Commons Standing Committee on Access to Information, Privacy and Ethics in 2017 after it had carried out an extensive study and consultation on the issue of privacy at borders and airports. (I testified at those hearings on behalf of the Canadian Bar Association.) It recommended that the threshold of “reasonable grounds to suspect” should be the threshold. </p>
<p>The threshold is so low that it’s hardly a threshold at all. It’s a licence for the CBSA to continue its practice of routinely searching electronic devices, and it will invite continued legal challenges. I just really wish the legislators would listen to the experts and the courts. </p>
<div class="blogger-post-footer"></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-17826381417115468272022-05-16T12:51:00.003-03:002022-07-29T11:22:31.975-03:00Video: Law enforcement requests for customer information - Come Back With A Warrant<p><iframe width="720" height="480" src="https://www.youtube.com/embed/Irp1UalqXn8" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<p>Canadian businesses are routinely asked by police agencies to provide customer information in order to further their investigations or intelligence gathering. The police generally do not care whether the business can legally disclose the information and, in my experience, the police are generally ignorant of privacy laws that restrict the ability of Canadian businesses to cooperate with law enforcement investigations.</p>
<P>For some time, there was some degree of uncertainty about the extent to which Canadian businesses could voluntarily provide information to the police upon request. That uncertainty has since been resolved, and it is now clear that if the police come knocking, Canadian businesses should respond with “come back with a warrant”.</p>
<P>The uncertainty that used to exist is rooted in section 7 of the Personal Information Protection and Electronic Documents Act, also known as PIPEDA. Section 7 is that part of the law that allows businesses to collect, use or disclose personal information without the consent of individuals. Not surprisingly, there is a provision that dictates whether an organization can or cannot give the police customer information if the police come knocking. </P>
<P>Section 7(3)(c.1) allows a business to disclose personal information to a police agency upon request if they have indicated that the information is necessary for a range of purposes and have identified their lawful authority to obtain the information. There's another provision in the act that deals with what happens when the police show up with a warrant or a production order. </p>
<P>It is clear that in those circumstances, personal information can be disclosed. If it is a valid Canadian Court order, it is likely that not providing the information could subject the business to prosecution.</p>
<P>There's also a provision in the Canadian criminal code that makes it clear that the police can ask for anything from a person who is not prohibited by law from disclosing, which further fed this uncertainty.</p>
<P>So for some time in Canada, the police believed that businesses could disclose information without a warrant as long as it was associated with the lawful investigation. Police believed that the fact that they were investigating a crime is all the “lawful authority” they needed. </p>
<P>Where this would come up most often would be if police had identified illegal online conduct and had the IP address of a suspect. They would seek from an internet service provider the customer name and address that was associated with that IP address at that time. Without that information, they had no suspect to investigate and ISPs hold the keys connecting that IP address with a suspect.</p>
<P>The Canadian Association of Internet Providers actually concluded a form of protocol with Canadian police that would facilitate the provision of this information. Surprisingly, the CAIP was of the view that this was not private information. What would be required was a written request from a police agency indicating that the information was relevant to an investigation of certain categories of online offences, principally related to child exploitation. These letters cited that they were issued under the “authority of PIPEDA”, which is simply absurd. </P>
<P>It is my understanding that the internet providers were generally comfortable with providing this information in connection with such important investigations. For other categories of offenses, they would require a production order.</p>
<P>It is also my understanding that some internet providers fine-tuned their terms of service and privacy policies to permit these sorts of disclosures, so that the businesses would have additional cover by saying in fact the customer had consented to disclosure under these circumstances.</p>
<P>One thing to bear in mind, of course, is that this provision in PIPEDA is permissive, meaning that if this interpretation was correct businesses could voluntarily provide this information, but does not compel them to do so. They could always insist on a court order, but very often did not.</p>
<P>Some courts found this agreeable and found that evidence provided voluntarily under this scheme was permissible, while other courts found it to be a violation of the suspect’s Section 8 rights under the Charter. </p>
<P>Then along came a case called <a href="https://canlii.ca/t/g7dzn">R. v Spencer</a>. In this case, a police officer in Saskatoon, Saskatchewan detected someone sharing a folder containing child pornography using a service called LimeWire. The officer was able to determine the IP address of the internet connection being used by that computer and was able to determine that the IP address was allocated to a customer of Shaw Communications. So the cop sent a written “law enforcement request” to Shaw and Shaw handed over the customer information associated with the account. The cops did not try to obtain a production order first. </p>
<P>The account was actually in the name of the accused’s sister.</p>
<P>The case finally found its way up to the Supreme Court of Canada, where the court had to determine whether the request was a “search” under the Charter. It was. And then the question was whether the search was authorized by law. The Court said it was not. </P>
<P>The police and prosecution, of course, argued that this is just “phone book information” that doesn’t implicate any serious privacy issues. The court disagreed, quoting from a Saskatchewan Court of Appeal decision from 2011 called Trapp:</p>
<blockquote>“To label information of this kind as mere “subscriber information” or “customer information”, or nothing but “name, address, and telephone number information”, tends to obscure its true nature. I say this because these characterizations gloss over the significance of an IP address and what such an address, once identified with a particular individual, is capable of revealing about that individual, including the individual’s online activity in the home.”</blockquote>
<P>Justice Cromwell writing for the court concluded that “Here, the subject matter of the search is the identity of a subscriber whose Internet connection is linked to particular, monitored Internet activity.”</p>
<P>The court said that constitutionally protected privacy includes anonymity. Justice Cromwell wrote, and then quoted from the Spencer decision of the Court of Appeal:</p>
<blockquote>[51] I conclude therefore that the police request to Shaw for subscriber information corresponding to specifically observed, anonymous Internet activity engages a high level of informational privacy. I agree with Caldwell J.A.’s conclusion on this point:
<blockquote>. . . a reasonable and informed person concerned about the protection of privacy would expect one’s activities on one’s own computer used in one’s own home would be private. . . . In my judgment, it matters not that the personal attributes of the Disclosed Information pertained to Mr. Spencer’s sister because Mr. Spencer was personally and directly exposed to the consequences of the police conduct in this case. As such, the police conduct prima facie engaged a personal privacy right of Mr. Spencer and, in this respect, his interest in the privacy of the Disclosed Information was direct and personal. </blockquote></blockquote>
<P>The court then was tasked with considering what “lawful authority” means in subsection 7(3)(c.1). </p>
<P>The court concluded that the police, carrying out this investigation, did not have the lawful authority that would be required to trigger and permit the disclosure under the subsection. While the police can always ask for the information, they did not have the lawful authority to obtain it. If they had sought a production order, their right to obtain the information and Shaw’s obligation to disclose it would have been clear.</p>
<P>What the court did not do was settle what exactly lawful authority means. It does not mean a simple police investigation, even for a serious crime, but what it might include remains unknown.</p>
<P>What is clear, however, is the end result that this subsection of PIPEDA simply does not permit organizations to hand over customer information simply because the police agency is conducting a lawful investigation. If they want the information, they have to come back with a court order.</p>
<P>Just a quick note about other forms of legal process. While production orders are the most common tool used by law enforcement agencies to seek and obtain customer information, a very large number of administrative bodies are able to use different forms of orders or demands. For example, CRTC spam investigators can use something called a notice to produce under the anti-spam legislation, which is not reviewed or approved by a judge in advance.</p>
<P>It is not uncommon for businesses to receive subpoenas, and they need to tread very carefully and read the details of the subpoena. In order to comply with privacy legislation, the organization can only do what it is directed to do in the subpoena, no more. In the majority of cases, the subpoena will direct the company to send somebody to court with particular records. Just sending those records to the litigants or the person issuing the subpoena is not lawful.</p>
<P>Before I wrap up, it should be noted that the rules are different if it is the business itself reporting a crime. Paragraph (c.1) applies where the police come knocking looking for information. Paragraph (d) is the provision that applies where the organization itself takes the initiative to disclose information to the police or a government institution. It specifically says that an organization may disclose personal information without consent where the disclosure is made on the initiative of the organization to a government institution and the organization has reasonable grounds to believe that the information relates to a contravention of the laws of Canada, a province or a foreign jurisdiction that has been, is being or is about to be committed. </P>
<P>This paragraph gives much more discretion to the organization, but it is still limited to circumstances where the organization has reasonable grounds to believe the information relates to such a contravention, and it can only disclose the minimum amount of personal information that is reasonably necessary for these purposes.</P>
<P>A scenario that comes up relatively often would be if a store is robbed, and there is surveillance video of the robbery taking place including the suspect. The store can provide that video to the police on their own initiative. Contrast that to another common scenario, where the police are investigating a crime and evidence may have been captured on surveillance video. If it is the police asking for it, and not the organization reporting it on their own initiative, the police have to come back with a court order.</p>
<P>At the end of the day, the safest and smartest thing that a business can do when asked for any customer personal information is to simply say come back with a warrant. Even if you think you can lawfully disclose the information, it simply makes sense that it be left to an impartial decision maker such as a judge or a Justice of the Peace to do the balancing between the public interest in the police having access to the information and the individual privacy interest at play.</p>
<div class="blogger-post-footer"></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-69228763005014453532022-05-12T19:53:00.005-03:002022-05-12T19:53:52.632-03:00Presentation: Privacy civil claims<p>I had the honour this week of presenting to a continuing education event for judges on privacy civil claims, past, present and future. I was joined by <a href="https://www.fasken.com/en/antoine-aylwin">Antoine Aylwin</a> and <a href="https://law.uwo.ca/about_us/faculty/erika_chamberlain.html">Erika Chamberlain</a>.
<p>To make it a little more daunting, some of the judges who wrote the decisions I referred to were in the room...
<p>It may be of interest to the privacy nerds who follow my blog, so here's the presentation:
<p><iframe src="https://docs.google.com/presentation/d/e/2PACX-1vQGNmc-KIcWpVz-GOyyyn0wIMg43ANRSKowxfbGJV9jSG6Bs_RmvplyB6QGGq4M3A/embed?start=false&loop=true&delayms=3000" frameborder="0" width="480" height="389" allowfullscreen="true" mozallowfullscreen="true" webkitallowfullscreen="true"></iframe></p>
Presentation: Lawyers and InfoSec professionals - playing nicely with lawyers to provide more value in your engagements (May 5, 2022)
<p>I was very kindly invited back to give a keynote at the <a href="https://members.htcia.org/events/Details/2022-htcia-canada-cyber-summit-407535?sourceTypeId=Website">Canadian Cyber Summit</a> for the <a href="https://htcia.org">High Technology Crime Investigation Association</a>. I spoke about the role of lawyers in incident response and how a greater understanding between lawyers and the technical folks of their respective roles can add value to the overall engagement. I also discussed the importance of legal advice privilege in incident response. Here is a copy of the <a href="https://docs.google.com/presentation/d/18Qs66lis-oo3I-TH0UZVTRJTftKnBSiC/edit?usp=sharing&ouid=109239855867197881918&rtpof=true&sd=true" target="_blank">presentation I gave</a>, in case it's of interest.</p>
<p><iframe src="https://docs.google.com/presentation/d/e/2PACX-1vSgDUBG4GEin49c_N6qRixWSctVtScOfB1XbDMCC7hichyAaHCmgVM1-9crXXx8bg/embed?start=false&loop=false&delayms=15000" frameborder="0" width="480" height="389" allowfullscreen="true" mozallowfullscreen="true" webkitallowfullscreen="true"></iframe></p>
Video: Canada's Anti-Spam Law and the installation of software (April 18, 2022)
<p><iframe width="720" height="480" src="https://www.youtube.com/embed/Kg9hHi6Kl8c" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
<P>Canada’s anti-spam law is about much more than just spam. It also regulates the installation of software. Like the rest of the law, it is complicated and convoluted and has significant penalties. If you’re a software developer or an IT admin, you definitely need to know about this. </p>
<P>So we’re talking about Canada’s anti-spam law. The official title is much longer, and it also includes two sets of regulations to make it more complicated.</p>
<P>Despite the snappy title that most of us use, Canada’s Anti-Spam Law (or CASL), it is about more than just spam. It has often-overlooked provisions that make it illegal to install software on another person’s computer – or cause it to be installed – without their consent.</p>
<P>It was clearly put into the law to go after bad stuff like malware, viruses, rootkits, trojans, malware bundled with legitimate software, and botnets. But it is not limited to malevolent software. It potentially affects a huge range of software.</p>
<P>So here is the general rule from Section 8 of the Act:</p>
<blockquote><p>8. (1) A person must not, in the course of a commercial activity, install or cause to be installed a computer program on any other person’s computer system or, having so installed or caused to be installed a computer program, cause an electronic message to be sent from that computer system, unless</p>
<P>(a) the person has obtained the express consent of the owner or an authorized user of the computer system and complies with subsection 11(5); or</p>
<P>(b) the person is acting in accordance with a court order.</p></blockquote>
<P>Let’s break that down. The first part is that it has to be part of a commercial activity. I’m not sure they meant to let people off the hook if they’re doing it for fun and giggles. The “commercial activity” part is likely there so that the government can say this is justified under the federal “general trade and commerce power”. </p>
<P>They could have used the federal criminal law jurisdiction, but then they’d be subject to the full due process and fairness requirements of the Canadian Charter of Rights and Freedoms, and the government did not want to do that. They’d rather it be regulatory and subject to much lower scrutiny. </p>
<P>Then it says you can’t install a computer program on another person’s computer system, or cause one to be installed, without the express consent of the owner or an authorized user of the computer. (The definition of “computer system” would include desktops, laptops, smartphones, routers and appliances.) </p>
<P>The express consent has to be obtained in the manner set out in the Act, and I’ll discuss that later. </p>
<P>It additionally prohibits installing a computer program on another’s computer and then causing it to send electronic messages. This makes the creation of botnets for sending spam extra bad.</p>
<P>The definition of the term “computer program” is taken from the Criminal Code of Canada:</p>
<blockquote>“computer program” means computer data representing instructions or statements that, when executed in a computer system, causes the computer system to perform a function; (programme d’ordinateur)</blockquote>
<P>In addition to defined terms, there are some key ideas and terms in the Act that are not well-understood. </p>
<P>It talks about “installing” a computer program, but what that is has not been defined in the legislation and the CRTC hasn’t provided any helpful guidance. </p>
<P>I wouldn’t think that running malware once on someone’s system for a malevolent purpose would be captured in the definition of “install”, though it likely is the criminal offence of mischief in relation to data. </p>
<P>What about downloading source code that is not yet compiled? Or then compiling it?</p>
<P>It is certainly possible to load up software and have it ready to execute without being conventionally “installed”. Does it have to be permanent? Or show up in your installed applications directory? </p>
<P>I don’t know. </p>
<P>There’s also the question of who is an owner or an authorized user of a computer system. </p>
<P>If it is leased, the leasing company likely owns the computer and we’ve certainly seen reports and investigations of spyware and intrusive software installed on rented and leased laptops. </p>
<P>My internet service provider owns my cable modem, so it’s apparently ok if they install malware on it. </p>
<P>For authorized users, it means any person who is authorized by the owner to use the computer system. Interestingly, it is not limited by the scope of the authorization. It seems to be binary. Either you are authorized or you are not. </p>
<P>There are some scenarios to think about when considering owners and authorized users.</p>
<P>For example, if a company pre-installs software on a device at the factory or before ownership transfers to the end customer, that company is the owner of the device and can install whatever they like on it.</p>
<P>Many companies issue devices like laptops and smartphones to employees. Those employers own the devices and can install any software on them. </p>
<P>But increasingly, employees are using devices that they own for work-related purposes, and employers may have a legitimate interest in installing mobile device management and security software on those devices. Unless there’s a clear agreement that the employer gets to do so, they may find themselves to be offside the law.</p>
<P>So, in short, it is prohibited to do any of these things without the express consent of the owner or authorized user:</p>
<ul><li>(a) install a computer program of any kind;
<li>(b) cause a computer program of any kind to be installed, such as hiding or bundling additional software in an installer that the owner or authorized user has installed. We sometimes see this when downloading freeware or shareware, and the installer includes other software that the user didn’t ask for;
<li>(c) or cause such a program that has been installed to send electronic messages after installation. </li></ul>
<P>Of course, someone who is the owner or authorized user of the particular device can put whatever software they want on the device. This only covers installation by people who are not the owner or the authorized user of the device.</p>
<P>There are some exceptions that people should be aware of.</p>
<P>It is common to install software and to have it automatically update. This is ok if the user consents to the auto updates. But that probably doesn't apply if the update results in software that does very different things compared to when it was first installed. </p>
<P>There are some cases where consent is deemed or implied. </p>
<P>CASL deems users to consent to the installation of the following list of computer programs if the user’s conduct shows it is reasonable to believe they consented to it. It is a weird list. </p>
<P>At the top of the list are “cookies”. To start with, anyone who knows what cookies are knows they are not computer programs. They are text files, and including them on this list tells me that the people who wrote this law may not know as much as you may hope about this subject. </p>
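<p>The point that a cookie is plain data rather than an executable program is easy to demonstrate. Here is a minimal sketch using Python's standard library (the cookie name and value are made up for illustration): a cookie header is just parsed text, and nothing in it is ever executed.</p>

```python
from http.cookies import SimpleCookie

# A cookie, as carried in an HTTP "Set-Cookie" header, is just text:
# name=value pairs plus a few attributes. (The values here are invented.)
raw = "session_id=abc123; Path=/; HttpOnly"

cookie = SimpleCookie()
cookie.load(raw)  # parsing text, not running code

print(cookie["session_id"].value)    # the stored value, plain text: abc123
print(cookie["session_id"]["path"])  # the Path attribute: /
```

<p>Whatever one thinks of including cookies in a deemed-consent list for "computer programs", the data itself carries no instructions for the computer to execute.</p>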
<P>It then includes HTML code. HTML is hypertext markup language. I suppose it is data that represents instructions to a computer on how to display text and other elements. The next question is whether this includes variations of HTML like XHTML. I don’t know. But if HTML is a computer program, then so are fonts and Unicode character codes. </p>
<P>Next it refers to “Java Scripts”. Yup. That’s what it says. We are told by Industry Canada that this is meant to refer to JavaScript, which is different from a Java script. Not only could they have avoided such a sloppy mistake, but they could also have been clear about whether they were referring to JavaScript run in a browser (with its attendant sandbox) or something else.</p>
<P>Next on the list are “operating systems”, which seems very perverse to include. The operating system is the mostly invisible layer that lies between the computer hardware and the software that runs on top of it. Changes to the operating system can have a huge impact on the security and privacy of the user, and much of it happens below the surface, invisible to the user. And there is no clarity about whether an “operating system” on this list includes the software that often comes bundled with it. When I replace the operating system on my Windows PC, I get a new version of a whole bunch of standard software that comes with it, like the file explorer and notepad. It would make sense to deem consent for a user who upgrades from one version of MacOS or Windows to another. But I could make an open source operating system distro that’s full of appalling stuff in addition to the operating system.</p>
<P>Finally, it says any program executable only through the use of another computer program for which the user has already consented to installation. Does this include macros embedded in Word documents? Not sure. </p>
<P>It makes sense to have deemed consent situations or implied consent, but we could have used a LOT more clarity. </p>
<P>There are some exceptions to the general rule of getting consent, two of which are exclusively reserved to telecommunications service providers, and a final one that relates to programs that exclusively correct failures in a computer system or a computer program.</p>
<P>This is understandable, but this would mean that a telco can install software on my computer without my knowledge or consent if it’s to upgrade their network. </p>
<P>So how do you get express consent? It’s like the cumbersome express consent required for commercial electronic messages, but with more. </p>
<P>When seeking express consent, the installer has to identify </p>
<ul><li>the reason;</li>
<li>their full business name;</li>
<li>their mailing address, and one of: telephone number, email address, or web address;</li>
<li>if consent is sought on behalf of another person, a statement indicating who is seeking consent and on whose behalf consent is being sought;</li>
<li>a statement that the user may withdraw consent for the computer program’s installation at any time; and</li>
<li>a clear and simple description, in general terms, of the computer program’s function and purposes. </li></ul>
<P>But if an installer “knows and intends” that a computer program will cause a computer system to operate in a way its owner doesn’t reasonably expect, the installer must provide a higher level of disclosure and acknowledgement to get the user’s express consent. </p>
<P>This specifically includes the following functions, all of which largely make sense: </p>
<ul><li>collecting personal information stored on the computer system;</li>
<li>interfering with the user’s control of the computer system;</li>
<li>changing or interfering with settings, preferences, or commands already installed or stored on the computer system without the user’s knowledge;</li>
<li>changing or interfering with data stored on the computer system in a way that obstructs, interrupts or interferes with lawful access to or use of that data by the user;</li>
<li>causing the computer system to communicate with another computer system, or other device, without the user’s authorization;</li>
<li>installing a computer program that may be activated by a third party without the user’s knowledge; and</li>
<li>performing any other function CASL specifies (there are none as yet).</li></ul>
<P>Like the unsubscribe requirement for commercial electronic messages, anyone who installs software that meets this higher threshold has to include an electronic address, valid for at least one year, that the user can use to ask the installer to remove or disable the program. </p>
<P>A user can make this request if she believes the installer didn’t accurately describe the “function, purpose, or impact” of the computer program when the installer requested consent to install it. If the installer gets a removal request within one year of installation, and consent was based on an inaccurate description of the program’s material elements, then the installer must assist the user in removing or disabling the program as soon as feasible – and at no cost to the user. </p>
<P>So how is this enforced? CASL is largely overseen by the enforcement team at the Canadian Radio-television and Telecommunications Commission. </p>
<P>Overall, I see them at least making more noise about their enforcement activities in the software arena than the spam arena. </p>
<P>In doing this work, the CRTC has some pretty gnarly enforcement tools. </p>
<P>First of all, they can issue “notices to produce” which are essentially similar to Criminal Code production orders except they do not require judicial authorization. These can require the recipient of the order to hand over just about any records or information, and unlike Criminal Code production orders, they can be issued without any suspicion of unlawful conduct. They can be issued just to check compliance. I should do a whole episode on these things, since they really are something else in the whole panoply of law enforcement tools. </p>
<P>They can also seek and obtain search warrants, which at least are overseen and have to be approved by a judge. </p>
<P>Before CASL, I imagine the CRTC was entirely populated by guys in suits and now they get to put on raid jackets, tactical boots and a badge.</p>
<P>I mentioned before that there can be some significant penalties for infractions of CASL’s software rules. </p>
<P>It needs to be noted that contraventions involve “administrative monetary penalties” - not a “punishment” but intended to ensure compliance. These are not fines per se and are not criminal penalties. That’s because if they were criminal or truly quasi-criminal, they’d have to follow the Charter’s much stricter standards for criminal offences. </p>
<P>The maximum administrative monetary penalties are steep: up to $1M for an individual offender and $10M for a corporation. </p>
<P>The legislation sets out a bunch of factors to be considered in determining the amount of penalty, including the ability of the offender to pay. </p>
<P>There is a mechanism similar to a US consent decree where the offender can give an “undertaking” that halts enforcement, but likely imposes a whole bunch of conditions that will last for a while. </p>
<P>Officers and directors of companies need to know they may be personally liable for penalties and of course the CRTC can name and shame violators.</p>
<P>There is a due diligence defence, but this is a pretty high bar to reach. </p>
<P>We have seen at least three reported enforcement actions under the software provisions of CASL. </p>
<P>The first, in 2018, involved two companies called Datablocks and Sunlight Media. They were found by the CRTC to be providing a service that others used to inject exploits onto users’ computers through online ads. They were hit with penalties amounting to $100K and $150K, respectively. </p>
<P>The second was in 2019 and involved a company called Orcus Technologies, which was said to be marketing a remote access trojan. They marketed it as a legitimate tool, but the CRTC concluded this was to give a veneer of respectability to a shady undertaking. They were hit with a penalty of $115K. </p>
<P>The most recent one, in 2020, involved a company called Notesolution Inc. doing business as OneClass. They were involved in a shady installation of a Chrome extension that collected personal information on users’ systems without their knowledge or expectation. They entered into an undertaking, and agreed to pay $100K.</p>
<P>I hope this has been of interest. The discussion was obviously at a pretty high level, and there is a lot that is unknown about how some of the key terms and concepts are being interpreted by the regulator. </p>
<P>If you have any questions or comments, please feel free to leave them below. I read them all and try to reply to them all as well. If your company needs help in this area, please reach out. My contact info is in the notes, as well. </p>
Video: Privacy and start-ups ... what founders need to know (April 13, 2022)
<p><iframe width="720" height="480" src="https://www.youtube.com/embed/B7qXe7pcAPI" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
<P>In my legal practice, I have seen businesses fail because they did not take privacy into account. I’ve seen customers walk away from deals because of privacy issues and I’ve seen acquisitions fail due diligence because of privacy. </p>
<p>Today, I’m going to be talking about privacy by design for start-ups, to help embed privacy into growing and high-growth businesses. </p>
<p>Episode 2 of Season 4 of HBO’s “Silicon Valley” provides a good case study on the possible consequences of not getting privacy compliance right. </p>
<p><iframe width="720" height="480" src="https://www.youtube.com/embed/N3zU7sV4bJE" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<p>Privacy means different things to different people. And people have wildly variable feelings about privacy. As a founder, you need to understand that and take that into account. </p>
<p>In some ways, privacy is about being left alone, not observed and surveilled. </p>
<p>It is about giving people meaningful choices and control. They need to understand what is happening with their personal information and they should have control over it. What they share and how it is used. They should get to choose whether something is widely disseminated or not. </p>
<p>Privacy is also about regulatory compliance. As a founder you need to make sure your company complies with the regulatory obligations imposed on it. If you are in the business to business space, you will need to understand the regulatory obligations imposed on your customers. I can guarantee you that your customers will look very, very closely at whether your product affects their compliance with their legal obligations. And they’ll walk away if there’s any realistic chance that using your product puts their compliance at risk.</p>
<p>Privacy is about trust in a number of ways. If you are in the business to consumer space, your end-users will only embrace your product if they trust it: if they know what the product is doing with their information and they trust you to keep it consistent. If you are in the business to business space, your customers will only use your product or service if they trust you. If you’re a start-up, you don’t yet have a track record or wide adoption to speak on your behalf. A deal with a start-up is always a leap of faith, and trust has to be built. And there are a bunch of indicators of trustworthiness. I have advised clients to walk away from deals where the documentation and responses to questions don’t suggest privacy maturity. If you have just cut and pasted your privacy policy from someone else, we can tell. </p>
<p>Privacy is not just security, but security is critical to privacy. Diligent security is table stakes. And a lack of security is the highest risk area. We seldom see class-action lawsuits for getting the wrong kind of consent, but most privacy/security breaches are followed by class-action lawsuits. Your customers will expect you to safeguard their data with the same degree of diligence as they would do it themselves. In the b2b space, they should be able to expect you to do it better. </p>
<p>You need to make sure there are no surprises. Set expectations and meet them.</p>
<p>In my 20+ years working with companies on privacy, one thing is clear. People don’t like it when something is “creepy”. Usually this is a useless word, since the creepy line is drawn very differently by different people. But what I’ve learned is that where the creepy line sits depends on expectations: things are always creepy or off-putting when something happens with your personal information that you did not expect. </p>
<p>As a founder, you really have to realize that regardless of whether or not you care about privacy yourself, your end users care about privacy. Don't believe the hype, privacy is far from dead.</p>
<p>If you are in the business to business arena, your customers are going to care very deeply about the privacy and security of the information that they entrust you with. If you have a competitor with greater privacy diligence or a track record, you have important ground to make up.</p>
<p>And, of course, for founders, getting investment is critical to the success of the business. The investors during your friends and family round or even seed funding might not be particularly sophisticated when it comes to privacy. But mark my words: sophisticated funds carry out due diligence and know that privacy failures can often equal business failures. I have seen investments go completely sideways because of privacy liabilities that were hidden in the business. And when it comes time to make an exit via acquisition, every single due diligence questionnaire has an entire section, if not a chapter, on privacy and security matters. The weeks leading up to a transaction are not the time to be slapping Band-Aids on privacy problems that were built into the business or the product from the very first days. As a founder, you want to make sure that potential privacy issues are, at least, identified and managed long before that point.</p>
<p>The borderless world</p>
<p>I once worked with a founder and CEO of a company who often said that if you are a startup in Canada, and your ambition is the Canadian market, you have set your sights too low and you are likely to fail. The world is global, and digital is more global than any other sector. You might launch your minimally viable product or experiment with product market fit in the local marketplace, but your prospective customers are around the world. This also means that privacy laws around the world are going to affect your business.</p>
<p>If your product or services are directed at consumers, you will have to think about being exposed to and complying with the privacy laws of every single jurisdiction where your end users reside. That is just the nature of the beast.</p>
<p>If you're selling to other businesses, each of those businesses is going to be subject to local privacy laws that may differ significantly from what you're used to. Once you get into particular niches, such as processing personal health information or educational technology, the complexity and the stakes rise significantly.</p>
<p>You definitely want to consult with somebody who is familiar with the alphabet soup of PIPEDA, PIPA, CASL, PHIA, GDPR, COPPA, CCPA, CPRA, HIPAA. </p>
<p>You're going to want to talk carefully and deeply with your customers to find out what their regulatory requirements are, which they need to push down onto their suppliers.</p>
<p>The consequences of getting it wrong can be significant. You can end up with a useless product or service, one that cannot be sold or that cannot be used by your target customers. I’ve seen that happen. </p>
<p>A privacy incident can cause significant reputational harm, which can be disastrous as a newcomer in a marketplace trying to attract customers.</p>
<p>Fixing issues after the fact is often very expensive. Some privacy and security requirements may mandate a particular way to architect your back-end systems. Some rules may require localization for certain customers, and if you did not anticipate that out of the gate, implementing those requirements can be time and resource intensive.</p>
<p>Of course, there's always the possibility of regulatory action resulting in fines and penalties. Few things stand out on a due diligence checklist like having to disclose an ongoing regulatory investigation or a hit to your balance sheet caused by penalties.</p>
<p>All of these, individually or taken together, can be a significant impediment to closing an investment deal or a financing, and can be completely fatal to a possible exit by acquisition.</p>
<p>So what's the way to manage this? It's something called privacy by design, which is a methodology that was originally created in Canada by Dr Ann Cavoukian, the former information and privacy commissioner of Ontario.</p>
<p>Here's what it requires at a relatively high level. </p>
<p>First of all, you need to be proactive about privacy and not reactive. You want to think deeply about privacy, anticipate issues and address them up front rather than reacting to issues or problems as they come up.</p>
<p>Second, you need to make privacy the default. You need to think about privacy holistically, focusing particularly on consumers and user choice, and setting your defaults to be privacy protective so that end users get to choose whether or not they deviate from those privacy protective defaults.</p>
<p>Third, you need to embed privacy into your design and coding process. Privacy should be a topic at every project management meeting. I'll talk about the methodology for that in a couple minutes.</p>
<p>Fourth, you need to think about privacy as a positive-sum game rather than a zero-sum game. Too often, people think about privacy versus efficiency, or privacy versus innovation, or privacy versus security. You need to be creative and think about privacy as a driver of efficiency, innovation and security.</p>
<p>Fifth, you need to build in end-to-end security. As I mentioned before, security may in fact be the highest risk area, given the possibility of liability and penalties. You need to think about protecting end users from themselves, from their carelessness, and from all possible adversaries.</p>
<p>Sixth, you need to build in visibility and transparency. Just about every single privacy law out there requires that an organization be open and transparent about its practices. In my experience, an organization that is proactive in talking about privacy and security, and how it addresses them, has a significant “leg up” compared to anybody who is not.</p>
<p>Seventh, and finally, you need to always be aware that end users are human beings who have a strong interest in their own privacy. They might make individual choices that differ from your own privacy comfort levels, but that is human. Always look at your product and all of your choices through the eyes of your human end users. Think about how you will explain your product and services to an end user, and whether the choices that you have made in its design can be justified to them.</p>
<p>A key tool to implement this is to document your privacy process and build it iteratively into your product development process. For every single product or feature of a product, you need to document what data from or about users is collected. What data is generated? What inferences are made? You will want to get very detailed, knowing every single data field that is collected or generated in connection with your product.</p>
<p>Next, you need to carefully document how each data element is used. Why do you need that data, how do you propose to use it, and is it necessary for that product or feature? If it is not “must have” but “good to have”, how do you build that choice into your product?</p>
<p>You need to ask “is this data ever externally exposed”? Does it go to a third party to be processed on your behalf, is it ever publicly surfaced? Are there any ways that the data might be exposed to a bad guy or adversary?</p>
<p>In most places, privacy regulations require that you give individual users notice about the purposes for which personal information is collected, used or disclosed. You need to give users control over this. How are the obligations for notice and control built into your product from day one? When a user clicks a button, is it obvious to them what happens next?</p>
<p>You will then need to ask “where is the data”? Is it stored locally on a device or server managed by the user or the customer? Is it on servers that you control? Is it a combination of the two? Is the data safe, wherever it resides? To some people, local on device storage and processing is seen as being more privacy protective than storage with the service provider. But in some cases, those endpoints are less secure than a data center environment which may have different risks.</p>
<p>Finally, think about life cycle management for the data. How long is it retained? How long do you or the end user actually need that information for? If it's no longer needed for the purpose identified to the end user, it should be securely deleted. You'll also want to think about giving the end user control over deleting their information. In some jurisdictions, this is a legal requirement.</p>
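<p>To make the documentation steps above concrete, here is a minimal sketch of what a per-product data inventory might look like in code. This is purely illustrative: the field names, flags and review rules are my own assumptions for the example, not a compliance tool or any regulator’s checklist.</p>

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataElement:
    """One row in a product's data inventory (all fields illustrative)."""
    name: str                      # e.g. "email_address"
    purpose: str                   # why it is collected or generated
    must_have: bool                # necessary, or merely nice to have?
    externally_exposed: bool       # sent to processors or surfaced publicly?
    storage: str                   # "device", "our-servers", or "both"
    retention_days: Optional[int]  # None = no defined retention limit

def flag_issues(inventory):
    """Return human-readable flags for elements that need review."""
    issues = []
    for e in inventory:
        if not e.purpose:
            issues.append(f"{e.name}: no documented purpose")
        if e.retention_days is None:
            issues.append(f"{e.name}: no retention limit defined")
        if not e.must_have:
            issues.append(f"{e.name}: optional - needs a user choice")
    return issues

inventory = [
    DataElement("email_address", "account login and notices",
                must_have=True, externally_exposed=False,
                storage="our-servers", retention_days=365),
    DataElement("precise_location", "",
                must_have=False, externally_exposed=True,
                storage="both", retention_days=None),
]

for issue in flag_issues(inventory):
    print(issue)
```

<p>Even a simple table like this, reviewed for every new feature, forces the questions above to be answered before launch rather than after.</p>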
<p>Everybody on your team needs to understand privacy as a concept and how privacy relates to their work function. Not everybody will become a subject matter expert, but a pervasive level of awareness is critical. Making sure that you do have subject matter expertise properly deployed in your company is important.</p>
<p>You also have to understand that it is an iterative process. Modern development environments can sometimes be likened to building or upgrading an aircraft while it is in flight. You need to be thinking of flight worthiness at every stage. </p>
<p>When a product or service is initially designed, you need to go through that privacy design process to identify and mitigate all of the privacy issues. No product should be launched, even in beta until those issues have been identified and addressed. And then any add-ons or enhancements to that product or service need to go through the exact same scrutiny to make sure that no new issues are introduced without having been carefully thought through and managed.</p>
<p>I have seen too many interesting and innovative product ideas fail because privacy and compliance simply were not on the founder’s radar until it was too late. I have seen financing deals derailed and acquisitions tanked for similar reasons.</p>
<p>Understandably, founders are often most focused on product-market fit and a minimum viable product to launch. But you need to realize that a product that cannot be used by your customers, or that carries significant regulatory and compliance risk, is not a viable product.</p>
<p>I hope this has been of interest. The discussion was obviously at a pretty high level, but my colleagues and I are always happy to talk with startup founders to help assess the impact of privacy and compliance on their businesses.</p>
<p>If you have any questions or comments, please feel free to leave them below. I read them all and try to reply to them all as well. If your company needs help in this area, please reach out.</p>
<p>And, of course, feel free to share this with anybody in the startup community for whom it may be useful.</p>
<div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-21855948878418467312022-03-27T08:34:00.003-03:002022-03-27T08:34:46.657-03:00Video: Canada - US announce beginning of CLOUD Act negotiations<iframe width="640" height="480" src="https://www.youtube.com/embed/UY4cyd3rlsU" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<P>
<hr>
<P>Today, I’m going to be talking about the newly announced “CLOUD Act” agreement negotiation process between Canada and the US to facilitate cross-border law enforcement investigations. </P>
<P>This is just beginning, so I’ll necessarily be doing some speculating. </P>
<P>This week, the United States Department of Justice announced that the governments of the US and Canada are currently negotiating an agreement under the CLOUD Act to facilitate cross-border law enforcement investigations.</P>
<P>This is a big deal. This will mean that Canadian police can use Canadian court orders to get evidence in the US, and American search warrants can be served on Canadians. </P>
<P>It is intended to be a solution to an issue that affects law enforcement in both countries who want evidence that is on the other side of the border. </P>
<P>Every country has absolute sovereignty over what happens in its own territory. </P>
<P>No “sovereign” can do anything in another sovereign’s territory without permission or invitation. </P>
<P>Canadian enforcement powers end – abruptly – at the border. A criminal court can’t order anyone outside of its jurisdiction to do anything, including the production of records.</P>
<P>It’s reciprocal: foreign states can’t extend their law enforcement into Canada without permission or invitation.</P>
<P>As it currently stands, a US search warrant has no effect in Canada. A Canadian production order has no effect in the US. Canadian law ends at the border, as does American law. </P>
<P>The Criminal Code does not authorize the issuance of a production order directed at a person or entity outside of Canada. </P>
<P>(It is important to remember that there’s a big difference between civil lawsuits and criminal investigations.)</P>
<P>Notice I said “without permission or invitation”. To provide that permission, countries have often entered into mutual legal assistance treaties with one another. If you’re investigating something in our country and some evidence is in our country, tell us about it and maybe we’ll assist you in getting it. I’ll discuss this a bit more later. </P>
<P>The reality is that most reputable US service providers will provide information to Canadian law enforcement under a Canadian production order, as long as they can do so without risking a violation of US law. </P>
<P>For example, in the first half of 2021, Twitter reports that it received 56 information requests about 63 accounts and it complied with 45% of them.</P>
<P>During the same time, Meta/Facebook reports it received 1,110 “legal process requests” from Canada and complied with 82% of the requests it received. </P>
<P>As I said, a Canadian production order doesn’t really have any effect in the US. But they generally do follow them, voluntarily, when they can. </P>
<P>Currently, a US privacy law called the Stored Communications Act prevents certain service providers from providing certain categories of data except with a qualifying US warrant. This annoys a lot of Canadian investigators, who have to go through formalities under the Mutual Legal Assistance Treaty between the two countries in order to get a US qualifying warrant. </P>
<P>A CLOUD Act agreement would remove that barrier and permit most US warrants for records and information to have effect in Canada. It is reciprocal, so Canadian law enforcement can get court orders in Canada for records that are in the custody of American service providers. </P>
<h1>What is the CLOUD Act?</h1>
<P>The CLOUD Act, or “Clarifying Lawful Overseas Use of Data Act”, was enacted in 2018. At the time, it got a lot of attention because it rendered moot a very high profile case in which US law enforcement was looking for data stored by Microsoft in one of their data centres in Ireland. Microsoft sensibly resisted the order, saying that US law did not extend to data that was outside of the US. </P>
<P>The case finally found its way to the Supreme Court of the United States, but before a decision was rendered, the US enacted the CLOUD Act, which made it clear that a US warrant could compel US companies to provide stored data for a customer or subscriber on any server they own and operate, regardless of where it is located. The CLOUD Act also gives providers a mechanism to challenge a warrant if they believe the request violates the privacy rights of the foreign country in which the data is stored.</P>
<P>What the CLOUD Act also does is create a framework by which the US government can negotiate agreements with other governments for mutual recognition of the other country’s legal processes, subject to limitations set out in the agreement. </P>
<P>Before coming into effect, the bilateral or multi-lateral agreement needs to be put before the US congress, and the US Attorney General has to certify that the partner country has robust substantive and procedural protections for privacy and civil liberties. </P>
<P>The US has already negotiated such agreements with the United Kingdom and Australia. Now it’s Canada’s turn.</P>
<P>This will be welcome news to Canadian law enforcement, who regularly seek evidence from US-based technology companies but sometimes find themselves hampered by a number of factors. In fact, Canadian law enforcement lobbying groups like the Canadian Association of Chiefs of Police have been pushing hard to get Canada to negotiate a CLOUD Act agreement with the United States.</P>
<h1>Mutual Legal Assistance</h1>
<P>There has for some time been a mutual legal assistance treaty between Canada and the United States, which provides a government-to-government pathway for law enforcement in Canada to obtain access to information in the United States. It is a two-way street, which similarly provides American law enforcement with access to Canadian data.</P>
<P>Without an agreement like the MLAT, carrying out searches on foreign territory violates international law and sovereignty. </P>
<P>The mutual legal assistance process has been said to be cumbersome and time-consuming, mainly because all requests from Canadian law enforcement are routed through the Department of Justice Canada in Ottawa, which then sends a request to the United States Department of Justice. Both of these entities review the request, and the receiving government has an element of discretion as to whether or not it wishes to process it. Assuming it is acceptable to the Canadian and US central authorities, a lawyer from the US Department of Justice seeks an order from the United States federal court addressed to the service provider, requiring it to provide the data to the US DOJ, which then sends the data to the Canadian DOJ and then to the law enforcement agency.</P>
<P>A key part of this process is the review and approval by the central authorities in each country. They ask “does this fit within the treaty?” “Does it meet the legal thresholds?” “Is it appropriately tailored – not too broad?” “Is it consistent with our laws and values?” “Does it implicate any of our own domestic interests?” </P>
<P>Canadian law enforcement generally would prefer to avoid this, and have tried to do so by seeking production orders in Canadian courts that name US based service providers. </P>
<P>The Canadian Criminal Code does not authorize the service of production orders outside of Canada, mainly because a Canadian court does not have jurisdiction over someone who is not in Canada. Some courts simply will not issue these orders, but more are issuing them following a decision of the British Columbia Court of Appeal called Brecknell. For a number of reasons, I think that decision was wrongly decided, but for more information you can <a href="https://digitalcommons.schulichlaw.dal.ca/cjlt/vol18/iss1/5/" target="_blank">read my case comment</a>.</P>
<P>In my experience, most US service providers will provide data in response to Canadian Court orders, but they are prohibited under US criminal law from providing the content of any communications except with a qualifying US warrant. That can be obtained through the MLAT process, but a “qualifying US warrant” is not available from a Canadian court.</P>
<P>A few years ago, I was involved in a case on behalf of an American company where a Canadian law enforcement agency sought and obtained a production order that would have required the US company to violate American law. The case ultimately became moot before it went to a hearing, so there's no written decision I can point you to. But it was clear that the attempt to do so was out of frustration with the mutual legal assistance process and the perception of the time it takes. In reality, urgent orders can be turned around quite quickly and the average turnaround time is around 2 months.</P>
<P>The process ahead likely looks like this: it will take some time to negotiate the agreement between Canada and the US, since it is not “one size fits all”. Once the agreement is negotiated, it will have to go before the US Congress, a process that takes at least six months. And Canada would have to amend a number of laws before it can go into effect. </P>
<h1>What to expect</h1>
<P>So what would implementing a CLOUD Act agreement look like on the Canadian side of the border? I would only be speculating, because we don't have a final agreement to look at, but a number of laws would have to be amended.</P>
<P>For example, all of our existing privacy laws in Canada prohibit the disclosure of personal information or personal health information except to comply with a warrant, production order, court order or where required by law. Currently, that would be read as meaning required by Canadian federal or provincial law, or under a Canadian court order. </P>
<P>Complying with a US order would not fit within that. Those barriers would need to be taken down, or a new law would need to be passed so that these American orders could be complied with in Canada. </P>
<P>I don't think making US orders mandatory in Canada is how it would likely play out. On the American side of the border, the CLOUD Act does not make foreign orders mandatory in the United States. What it did was take down the barriers, mainly in the Stored Communications Act, that prevented US-based companies from disclosing certain categories of information. To be truly reciprocal, Canadian laws would need to be amended to permit disclosures to US law enforcement in response to a US court order or subpoena.</P>
<P>This is where I think things will get a little bit controversial in Canada. After all, two provinces went so far as to prohibit personal information from being stored outside of Canada or being accessed from outside of Canada because of an overblown concern about the USA PATRIOT act. In some instances, it is an offense to disclose personal information in response to a “foreign demand for disclosure”. All that would have to change, and I think that will attract some interesting responses.</P>
<P>At the end of the day, it makes sense that Canadian police should be able to go to a Canadian judge to get an order for access to information about Canadian suspects of a crime that took place in Canada. </P>
<P>It also makes sense that American police should be able to go to an American judge to get an order for access to information about American suspects of a crime that took place in the US. </P>
<P>The CLOUD Act agreements with the UK and Australia provide some idea about the guardrails that should be included in an agreement with Canada. </P>
<P>First, it should be limited to serious crimes, not trivial matters or purely administrative and regulatory proceedings. </P>
<P>Second, it should not permit one country to investigate the citizens or residents of the other country. It should be limited to Canadian authorities investigating Canadian crimes, or American authorities investigating American crimes. </P>
<P>Third, there should be a mechanism by which either country can say, for a particular request, that the agreement does not apply in that instance. </P>
<P>Fourth, there should be a mechanism by which a company that receives legal process can challenge it.</P>
<P>As a final note, when this progresses and we see what the agreement looks like, Canadians should be very careful to make sure that it is not used to further the Canadian so-called “lawful access” agenda that has been pursued for years and years by Canadian law enforcement. In particular, Canadian law enforcement have been trying to get the laws amended so they can get warrantless access to personal information.</p>
privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-43739324761975406352022-03-21T08:00:00.005-03:002022-03-21T09:00:55.905-03:00Video: Privacy laws and the media (Part 1)<P><iframe width="640" height="480" src="https://www.youtube.com/embed/HrENVUXjSQo" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<p>
<hr>
<p>Today, I’m going to be talking about privacy rights and freedom of expression in Canada. Specifically, I’m going to be talking about privacy and news reporting. </p>
<p>This is a pretty big topic that could fill an entire course at both law school and journalism school, but I’m hoping to provide an overview of the significant laws and principles at play. </p>
<h1>Charter</h1>
<p>Most of us would be familiar with the idea of freedom of expression or freedom of the press. </p>
<p>In Canada, it is guaranteed in section 2(b) of our Charter of Rights and Freedoms under the heading of “Fundamental Freedoms”. </p>
<p>This section reads: </p>
<p>“Everyone has the following fundamental freedoms: (b) freedom of thought, belief, opinion and expression, including freedom of the press and other media of communication;”</p>
<p>In Canada, we regularly talk about freedom of expression, which is guaranteed to everyone. It does include “freedom of the press.”</p>
<h1>Charter s. 1</h1>
<p>In understanding how section 2(b) works, we also have to understand that it is not absolute. The freedom of expression guarantee is subject to section 1 of the Charter, which allows some limitations on Charter guaranteed rights. </p>
<p>Section one says: </p>
<blockquote>“The Canadian Charter of Rights and Freedoms guarantees the rights and freedoms set out in it subject only to such reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society.”</blockquote>
<p>Let’s break that down. Charter guaranteed rights can be subject only to “reasonable limits”, that are “prescribed by law” that have to be demonstrably justified in a free and democratic society.</p>
<p>It is always up to the government to justify these limitations. </p>
<p>It is important to note that freedom of expression not only includes the right to express oneself, but the courts have found that it includes a right to receive information. Limiting a journalist’s right to report on something also limits the public’s right to receive that reporting. </p>
<h1>The Oakes test</h1>
<p>The Supreme Court of Canada has given us the test for how to determine if an infringement of a Charter right can be justified under section 1. This is called the Oakes test, from a 1986 decision of the Supreme Court. </p>
<p>This also could be its own law school course, but in summary here it is:</p>
<p>First the limitation has to be “prescribed by law”. That’s right from section one. It can be a federal or provincial statute. It can be a regulation or a by-law. But it can’t be a whim of a state actor. It has to be rooted in the law. In some cases, the law could be so vague that it does not qualify as prescribed by law. </p>
<p>Second, the objective of the law has to be pressing and substantial. The courts will not permit Charter rights to be infringed for trivial objectives, so the law has to be for an important purpose. </p>
<p>Third, the impact on the Charter right has to be proportional. This has three parts:</p>
<p>The means chosen by the legislature to address these objectives must be rationally connected to the objective. </p>
<p>In doing so, the measures must minimally impair the rights at issue.</p>
<p>Finally, there must be proportionality between the infringement and objective. This is a final balancing step. </p>
<p>In order for an infringement of a Charter right to be justified, the government has to satisfy all parts of this test. If it fails one part, its justification fails. </p>
<h1>The Common law</h1>
<p>The Oakes test is only used for limitations that are prescribed by law, and something different is done for the common law. The common law is that substantial portion of our laws that are judge made and a bit more fluid. </p>
<p>Many of the privacy claims that I’ll be talking about are “common law”, including “intrusion upon seclusion” and “public disclosure of private facts”. These aren’t subject, strictly speaking, to the Charter. </p>
<p>The Charter limits what governments can do and how our parliaments can legislate. The common law isn’t generally a government imposing limits on what people can do; it most usually regulates what legal claims one person can have against another.</p>
<p>But the Supreme Court has said that the Common law needs to evolve in line with Charter principles and Charter values. For example, in a 2009 case called Grant v Torstar, the Supreme Court of Canada said that the common law of defamation needed to include a defence of “responsible communication on a matter of public interest” to take into account freedom of expression. </p>
<p>The protection of reputation was an important value that had to be balanced against the important right of freedom of expression. </p>
<h1>Privacy statutes</h1>
<p>So, is the press subject to privacy statutes like the federal Personal Information Protection and Electronic Documents Act or the BC and Alberta Personal Information Protection Acts?</p>
<p>Generally speaking, when engaged in journalism, they are not subject to these laws. </p>
<p>To do otherwise would be unworkable: journalists would have to get consent from politicians before reporting about them, whether it is favourable or critical. That would be a significant intrusion into freedom of expression. </p>
<p>As a result, all three of these laws specifically exclude all collection, use and disclosure that is exclusively for journalistic purposes. </p>
<p>Here is what PIPEDA says…</p>
<blockquote>4(2) This Part does not apply to …<br>
(c) any organization in respect of personal information that the organization collects, uses or discloses for journalistic, artistic or literary purposes and does not collect, use or disclose for any other purpose.</blockquote>
<h1>Alberta PIPA</h1>
<p>Here is what Alberta’s PIPA says …</p>
<blockquote>4(3) This Act does not apply to the following: …<br>
the collection, use or disclosure of personal information, other than personal employee information that is collected, used or disclosed pursuant to section 15, 18 or 21, if the collection, use or disclosure, as the case may be, is for journalistic purposes and for no other purpose;</blockquote>
<h1>Common law claims</h1>
<p>But journalists are subject to the common law, like defamation, and could be subject to common law privacy claims. </p>
<p>I am not aware of any cases where a journalist has been sued for “intrusion upon seclusion” or “public disclosure of embarrassing private facts” in Canada. If one were to be sued, the Court would have to take into account freedom of expression. </p>
<h1>Public disclosure of private facts</h1>
<p>This tort says that one who gives publicity to a matter concerning the private life of another is subject to liability to the other for invasion of privacy if the matter publicized or the act of the publication (a) would be highly offensive to a reasonable person and (b) is not of legitimate concern to the public.</p>
<p>Note that it includes “not of legitimate concern to the public.” A lack of public interest is an important element of the tort, so proving public interest would defeat the claim. </p>
<h1>Intrusion upon seclusion</h1>
<p>In this tort, a person can sue another for an intentional (or reckless) intrusion into the private affairs of another without lawful justification, and that intrusion must be highly offensive to a reasonable person, causing distress, humiliation or anguish. </p>
<p>This tort was introduced into Canada in 2012 from the United States, and may be subject to some refining. It may well be that a court would have to read in the public interest factors that exist in the public disclosure tort in order to be consistent with the freedom of expression right in a case involving legitimate news reporting. Freedom of expression also includes the information gathering stage of reporting. </p>
<p>(You may have noticed that “public interest” came up in my discussion of the defamation defence created in Grant v Torstar and also in the public disclosure tort. Public interest in reporting is important.)</p>
<h1>Privacy Act (BC)</h1>
<p>Some provinces, like British Columbia, have statutory torts of invasion of privacy. They also use “public interest” to provide a defence. Here’s the wording from BC:</p>
<blockquote>1 (1) It is a tort, actionable without proof of damage, for a person, wilfully and without a claim of right, to violate the privacy of another.<br>
(2) The nature and degree of privacy to which a person is entitled in a situation or in relation to a matter is that which is reasonable in the circumstances, giving due regard to the lawful interests of others.</blockquote>
<p>Which could include news reporting. </p>
<blockquote>(3) In determining whether the act or conduct of a person is a violation of another's privacy, regard must be given to the nature, incidence and occasion of the act or conduct and to any domestic or other relationship between the parties.</blockquote>
<p>This last part would very likely take into account whether the intrusion were done by a journalist pursuing a story in the public interest. </p>
<p>The statute also specifically includes a defence for “Publications in the public interest or comment on a matter of public interest”. </p>
<p>But it is notable that this only extends to the publication, and not the collection of information leading to the publication. </p>
<h1>Interception</h1>
<p>We also have criminal laws that are designed to protect privacy. </p>
<p>For example, we have a wiretapping law that makes it an offence to intercept a private communication. It does not include a public interest defence, and I suppose it could be challenged if a reporter were engaged in wiretapping or eavesdropping as part of a story. </p>
<p>But just because it could be challenged, doesn’t mean it would necessarily be successful. It may well be that a court would say that any restriction on freedom of expression is justified, and everyone’s interest in being free from having their conversations overheard or phones tapped outweighs any impact on freedom of expression. </p>
<h1>Voyeurism</h1>
<p>We also have an offence of voyeurism, which includes a specific “public good” defence, which reads: </p>
<blockquote>“(6) No person shall be convicted of an offence under this section if the acts that are alleged to constitute the offence serve the public good and do not extend beyond what serves the public good.”</blockquote>
<p>It is hard to imagine a hypothetical scenario where a member of the press would be engaged in voyeurism and able to rely on the public good defence, but it is there. And legitimately informing the public on a matter of public interest would arguably serve the public good. </p>
<h1>Conclusion</h1>
<p>In Canada, freedom of expression and freedom of the press are important values. They are rights that are baked into our constitution and all laws in Canada that affect expression or the ability of the media to do their jobs have to be justified. </p>
<p>This includes privacy laws, which may be engaged every time a reporter is looking into the private affairs or the private life of a subject. </p>
<p>Thankfully, to take account of freedom of the press, journalists and journalistic purposes are specifically excluded from the application of our general privacy laws, which require individual consent for all collection, use and disclosure of personal information. </p>
<p>So what we’re left with are the general rules in the common law and in statutes that regulate the most problematic intrusions into privacy. While they haven’t been tested in the context of journalism, they do take freedom of the press into account. </p>
<p>Similarly, we have laws that criminalize wiretapping and voyeurism, which could be subject to challenge related to possible impacts on freedom of the press, but these guardrails are likely justifiable under section 1 of our Charter. </p>
privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-61666945004260902972022-03-14T08:00:00.004-03:002022-03-14T09:48:35.257-03:00Video: Home surveillance cameras<P><iframe width="640" height="480" src="https://www.youtube.com/embed/9Ph-ZOujWDI" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<p>In my legal practice, I exclusively advise businesses on matters related to privacy and technology law. But I am sometimes asked by individuals about the use of home surveillance cameras. Because of advances in technology and low cost, they’re everywhere. The rise of home delivery has led to porch pirates who steal packages, and people want to deter that or to try to catch porch pirates in the act. </p>
<p>If you keep an eye out walking down a suburban street, you’ll often see them. Doorbell cameras are very popular, but so are other cameras.</p>
<p>The purpose of this discussion is to review the laws that do and do not apply to individuals who use these devices on their own private property. At least in this discussion, I’m not going to talk about the laws as they may apply to companies that provide these services used by individuals. </p>
<H1>Different rules</H1>
<p>Many people are familiar with privacy regulations like the Personal Information Protection and Electronic Documents Act or the provincial Freedom of Information and Protection of Privacy Acts. </p>
<p>Businesses are regulated by commercial privacy laws, whether federal or provincial. </p>
<p>Government and police are regulated by public sector privacy laws.</p>
<p>But the personal and “domestic” collection of personal information is unregulated in Canada.</p>
<H1>General privacy regulations do not apply</H1>
<p>Commercial privacy regulations do not apply to private individuals collecting, using or disclosing personal information for their own personal purposes. </p>
<p>For example, the Personal Information Protection and Electronic Documents Act, known as PIPEDA, only applies to the collection, use and disclosure of personal information in the course of commercial activity. </p>
<p>And just to be more clear, paragraph 4(2)(b) of that Act excludes personal or domestic purposes:</p>
<p>It says This Part does not apply to … </p>
<blockquote>(b) any individual in respect of personal information that the individual collects, uses or discloses for personal or domestic purposes and does not collect, use or disclose for any other purpose; </blockquote>
<p>If you are collecting personal information – which includes video and images that include a person – only for personal or domestic purposes, that is excluded from the Act. </p>
<p>The Personal Information Protection Acts of British Columbia and Alberta are very similar. </p>
<p>For example, paragraph 3(2)(a) of the British Columbia Act has an exclusion that is very similar to PIPEDA’s.</p>
<blockquote>“This Act does not apply to the following: (a) the collection, use or disclosure of personal information, if the collection, use or disclosure is for the personal or domestic purposes of the individual who is collecting, using or disclosing the personal information and for no other purpose;”</blockquote>
<H1>Other “Privacy” laws</H1>
<p>Even though this activity is not captured by our general privacy laws, other laws may apply. </p>
<p>Our Criminal Code includes offences for voyeurism and the interception of private communications.</p>
<H1>Voyeurism</H1>
<p>The crime of voyeurism was added to the Criminal Code relatively recently. </p>
<p>It involves surreptitiously observing or recording a person where there is a reasonable expectation of privacy. </p>
<p>Paragraph (a) makes it an offence to observe or record in a place in which a person can reasonably be expected to be nude … or to be engaged in explicit sexual activity.</p>
<p>Paragraph (b) makes it an offence where the recording or observing is done for the purpose of observing or recording a person in such a state or engaged in such an activity.</p>
<p>Paragraph (c) covers a broader range of observation or recording, but where it is done for a sexual purpose. </p>
<p>People should be aware that the courts have held you can have a reasonable expectation of privacy in a relatively public place and that the expectation of privacy can vary according to the method of observation. For example, you may not have much of an expectation of privacy with regard to being observed by someone at eye level, but you may have a protected expectation of privacy from being observed or recorded up a person’s dress or from above to look down their top. </p>
<p>Don’t point a camera where someone has a reasonable expectation of privacy.</p>
<p>This would include pointing at a neighbour’s windows, fenced back yards, pool, hot tub, etc.</p>
<H1>Interception of private communications</H1>
<p>Audio recording is particularly hazardous in Canada. </p>
<p>Using a device to knowingly intercept a private communication can be a very serious offence in Canada. </p>
<p>If your camera can record audio, either disable that feature or don’t put the camera where it might record a private communication. And be careful. </p>
<p>You may have a camera on your fence-post that is exclusively pointed at your property, but it may capture private conversations among your neighbours on the other side of the fence.</p>
<p>Consent is a defence to a charge under this section, but it’s unclear if signage can create adequate consent.</p>
<H1>Other privacy laws</H1>
<p>In addition to the criminal law, people should also be mindful of the civil laws under which you can be sued. </p>
<p>This includes the law of nuisance, the law of trespass, and privacy claims under “intrusion upon seclusion” and some provincial privacy statutes. </p>
<H1>Nuisance</H1>
<p>Nuisance is a very old and well-established legal claim. It boils down to “unreasonable interference with the ordinary enjoyment of property.” </p>
<p>A lot of traditional, old nuisance claims relate to noises, bad smells, smoke and things like that, but we are starting to see cases where people claim that someone’s use of surveillance cameras is interfering with their enjoyment of their own property. </p>
<p>The case of <a href="https://www.canlii.org/en/bc/bcsc/doc/2009/2009bcsc1403/2009bcsc1403.html" target="_blank">Suzuki v. Munroe</a> from the British Columbia Supreme Court in 2009 is instructive. </p>
<p>In this case, the Suzukis sued the Munroes for having a loud air conditioner and for having a surveillance camera that captured part of the Suzuki property. In finding in favour of the plaintiffs, the judge wrote:</p>
<blockquote>“I have no doubt that a surveillance camera continuously observing the entrance areas to a neighbouring property, or any part thereof, in these circumstances, is an intolerable interference with the use and enjoyment of the neighbouring property…
<p>No useful purpose of any kind is served by having the camera directed at any part of the Suzuki property. </p>
<p>I am forced to conclude that the Munroes installed the camera and refused to remove or redirect it at least in part in order to provoke and annoy the Suzukis.</p>
<p>Acts done with the intention of annoying a neighbour and actually causing annoyance will be a nuisance, although the same amount of annoyance would not be a nuisance if done in the ordinary and reasonable use of the property….”</blockquote>
<p>It is important to note that the judge found the use of the camera was not really necessary for any legitimate purpose of the defendants. If there had been a legitimate purpose, it might not have been found to be a nuisance.</p>
<p>We’ll talk about another, similar BC case in a bit.</p>
<H1>Trespassing</H1>
<p>Trespassing is unlawful. It can be a criminal offence, a provincial offence or something you can sue someone for. </p>
<p>Don’t enter a neighbour’s property to install or locate a camera without their permission. Putting a camera physically on a property that is not yours without permission is also unlawful. </p>
<H1>Intrusion upon seclusion</H1>
<p>In addition to the more traditional torts that I just mentioned, we are seeing more pure privacy claims. </p>
<p>In most common law provinces, you can sue or be sued for “intrusion upon seclusion”. </p>
<p>It is, in summary “an intentional or reckless intrusion, without lawful justification, into the plaintiff's private affairs or concerns that would be highly offensive to a reasonable person.”</p>
<p>If you poke into someone’s private life in a way that would be highly offensive, harm and damages are presumed. </p>
<H1>Statutory torts</H1>
<p>Some provinces have what are called statutory torts of invasion of privacy. </p>
<p>Here is the gist of the British Columbia Privacy Act. </p>
<blockquote>1(1) It is a tort, actionable without proof of damage, for a person, wilfully and without a claim of right, to violate the privacy of another.</blockquote>
<p>This means that the plaintiff doesn’t have to prove they were actually harmed. That is presumed.</p>
<p>Note the violation has to be without a claim of right or legitimate justification. </p>
<p>It then goes on and says …</p>
<blockquote>(2) The nature and degree of privacy to which a person is entitled in a situation or in relation to a matter is that which is reasonable in the circumstances, giving due regard to the lawful interests of others.
<p>(3) In determining whether the act or conduct of a person is a violation of another's privacy, regard must be given to the nature, incidence and occasion of the act or conduct and to any domestic or other relationship between the parties.</blockquote>
<p>Note it specifically refers to eavesdropping and surveillance in subsection (4), which reads:</p>
<blockquote>(4) Without limiting subsections (1) to (3), privacy may be violated by eavesdropping or surveillance, whether or not accomplished by trespass.</blockquote>
<p>For the use of home surveillance cameras to protect your private property, paragraph 2(2)(b) is important:</p>
<blockquote>2(2) An act or conduct is not a violation of privacy if any of the following applies:
<p>(b) the act or conduct was incidental to the exercise of a lawful right of defence of person or property; …</blockquote>
<p>Let’s see how that plays out in practice.</p>
<p>This specifically came up in another British Columbia case called <a href="https://www.canlii.org/en/bc/bcsc/doc/2021/2021bcsc1640/2021bcsc1640.html" target="_blank">Minicucci v. Liu</a>, a 2021 decision from the British Columbia Supreme Court. </p>
<p>This was another dispute between neighbours. </p>
<p>For backyard privacy, the plaintiffs planted eight 25-foot cedars and twenty 10-foot cedars along the property line between the parties’ homes. The plaintiffs had a pool in their backyard, and the defendants had one as well. </p>
<p>Sometime later, the defendants asked the plaintiffs to “top” the trees because they were interfering with the defendants’ view. The plaintiffs refused. </p>
<p>Later, while the plaintiffs were away from their home, the defendants topped a number of the cedar trees. </p>
<p>The plaintiffs installed cameras pointed at the trees, and the cameras could also see into the defendants’ backyard. </p>
<p>So the plaintiffs sued the defendants, seeking damages and injunctive relief for trespass and the damage to the cedar trees. </p>
<p>The defendants filed a counterclaim seeking damages from the plaintiffs for nuisance and for invasion of privacy by the cameras. </p>
<p>The defendants’ privacy claim was dismissed because the use and location of the cameras were justified. Capturing a portion of the defendants’ backyard was incidental, and the cameras had been installed in response to the defendants’ trespass and topping of the trees. </p>
<p>The court also noted that it would not have been possible to record the trees without incidentally including some of the backyard. </p>
<H1>Other rules - Condo rules</H1>
<p>In some cases, there may be other rules that affect whether or how someone can install surveillance cameras. </p>
<p>In a 2022 Alberta court case called <a href="https://www.canlii.org/en/ab/abqb/doc/2022/2022abqb65/2022abqb65.html" target="_blank">Lupuliak and Condo Plan 82111689</a>, the Court of Queen’s Bench found against a condo owner because the installation of a doorbell camera on the person’s door violated the condo rules. A similar camera that had been installed on the person’s patio was found not to be an issue. </p>
<H1>Other rules - Leases</H1>
<p>If you’re a tenant, you would want to check your lease or check with your landlord before installing any device outside your leased space. This would also include your door. </p>
<H1>Purely public places</H1>
<p>Many cameras that people install to observe their front doors or driveways will also include coverage of public spaces like sidewalks and roads. </p>
<p>There’s a diminished expectation of privacy in a completely public space like a road or a sidewalk. </p>
<p>However, expectation of privacy is not binary but is more nuanced. </p>
<p>If it came up, a court would likely apply a balancing test: is your legitimate need to use the device proportionate to the intrusion on others?</p>
<H1>What if police ask for your footage?</H1>
<p>Since most people use home surveillance cameras to deter or detect criminal activity, it’s worth asking what to do if the police ask for your footage.</p>
<p>With the increasing adoption of these devices, police are more commonly doing a video or CCTV canvass as part of their investigations. This involves going around the area to see if there are any cameras that may have captured something that could further their investigation.</p>
<p>So if the police come knocking looking for footage from your camera, what should you do?</p>
<p>Unlike a business subject to the general privacy laws, an individual can give police footage without a warrant or a court order. That doesn’t mean you have to. It’s entirely up to you, unless they have something called a production order, which requires you to provide it.</p>
<p>Personally, I would ask them what they’re investigating and I’d decide whether to hand it over on that basis.</p>
<p>And if you are dealing with the police to report a crime and your cameras captured anything relevant, you can feel free to hand it over. </p>
<H1>Best practices</H1>
<p>So at the end of the day, what are the best practices?</p>
<p>In short, don’t be an idiot. </p>
<p>Be a good neighbour and minimise any recording of anything that is not your own property. </p>
<p>Let people – residents and visitors – know what’s going on. Talk to your neighbours and put up signs. Your neighbour may actually appreciate that you have cameras. </p>
<p>Certainly, don’t point a camera at any place you’d expect people to be nude or doing “things”. </p>
<p>Think about what you’re actually using the cameras for and adjust your settings accordingly. If you are concerned about prowlers at night or someone on your property when you’re not at home, some of these more advanced cameras can be set to only record at night or when you’re not at home.</p>
<H1>Takeaways</H1>
<p>Remember that though an individual in their private capacity is outside the usual privacy regulations, other laws and rules can still apply.</p>
<p>Respect your neighbours and their privacy interests.</p>
<div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-58415800665239368692022-03-07T08:30:00.005-04:002022-03-07T08:33:27.024-04:00Video: Individual access requests under PIPEDA<p>New on my<a href="https://www.youtube.com/channel/UCK1JnZpphwTJZfsjW4Ng3mA/videos" target="_blank"> YouTube Channel</a>. <p>
<iframe width="640" height="480" src="https://www.youtube.com/embed/Gr40IiYsxYk" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<h1>Intro</h1>
<P>Today I am going to be speaking about individual personal information access requests. If you're from Europe, you probably have heard the term data subject access requests, which is essentially the same concept. </p>
<P>This is where an individual gets to ask a business what information it has about them, get a copy of it and perhaps dispute its accuracy.</p>
<P>I remember when our federal privacy law was being debated and phased in, many businesses were concerned they would be overrun with individual access requests. They were particularly concerned with frivolous or vexatious ones. We really haven’t seen that in practice.</p>
<P>But the right exists and any organization that does business in Canada needs to know about it and should be able to manage it. </p>
<P>Today, I am only going to be talking about Canada’s Personal Information Protection and Electronic Documents Act. This law includes a general rule that individuals have an access right. Like most rules, this is not absolute and there are some exceptions. I plan to cover many of these exceptions in this discussion.</p>
<P>While this discussion is limited to Canada’s Personal Information Protection and Electronic Documents Act, you should probably know that every single Canadian privacy law includes an access right. </p>
<P>Most of our public sector laws are divided between freedom of information and protection of privacy. In the federal public sector, there is a separate Privacy Act and an Access to Information Act. Many provinces also have health privacy laws, all of which include an individual access right. </p>
<P>Though I am talking about the federal private sector law, you should know that some of the details can differ from law to law.</p>
<P>If you have followed any of these discussions, you will know that the Personal Information Protection and Electronic Documents Act is weird. This federal law is based on the general principles of the Canadian Standards Association Model Code for the Protection of Personal Information. In fact, that standard is appended as a schedule to the law. </p>
<P>If you read it, you will see that it is written as a general list of principles, not like most of our laws. The general rules are in the schedule but there are exceptions in the body of the statute. The body of the law and the Schedule have to be read together. </p>
<h1>The General Principle of Access</h1>
<P>So we will be looking at Principle 9 from the CSA Model Code and then sections 8 through 10 of the Act.</p>
<P>Of course, we have to start with the general rule of access. This is in Principle 9, entitled “individual access”. It says… </p>
<blockquote>“Upon request, an individual shall be informed of the existence, use, and disclosure of his or her personal information and shall be given access to that information. An individual shall be able to challenge the accuracy and completeness of the information and have it amended as appropriate.”</blockquote>
<P>This talks about access to the information itself. It also refers to access to information about how it has been used. And the individual also gets to challenge the accuracy and completeness of that information. </p>
<P>There are some sub-principles that elaborate on this. Sub principle 9.1 says…</p>
<blockquote>“9.1 Upon request, an organization shall inform an individual whether or not the organization holds personal information about the individual. Organizations are encouraged to indicate the source of this information. The organization shall allow the individual access to this information. … In addition, the organization shall provide an account of the use that has been made or is being made of this information and an account of the third parties to which it has been disclosed.”</blockquote>
<P>The business should answer the question about whether they even have information about the individual, and should be able to tell them where that information came from. </p>
<P>They also should be able to tell the individual how that information has been used and to whom it has been disclosed. Businesses are sometimes surprised to discover that they have to keep information about their information in order to satisfy this requirement.</p>
<P>Because a business cannot disclose personal information about somebody without their consent, and the information contained in an individual access request is pretty all-encompassing, it makes sense that the business can require the individual to prove that they are the person they purport to be. It also makes sense that the individual should cooperate in helping the business identify what information may be about them. </p>
<P>That includes “how do we know you are who you say you are?” And “where should we look to find information about you?”</p>
<P>Information provided in that particular context can only be used for that purpose.</p>
<h1>To whom has the information been disclosed?</h1>
<P>I mentioned that businesses have to keep information about their information. In sub-principle 9.3, individual access rights include a right to know to whom a person’s personal information may have been disclosed. The principle reads:</p>
<blockquote>“9.3 In providing an account of third parties to which it has disclosed personal information about an individual, an organization should attempt to be as specific as possible. When it is not possible to provide a list of the organizations to which it has actually disclosed information about an individual, the organization shall provide a list of organizations to which it may have disclosed information about the individual.”</blockquote>
<P>At the end of the day, organizations need to know where the data they control goes and need to be able to tell people when they ask.</p>
<h1>Timelines to respond</h1>
<P>The timelines to respond are a good example of the difference between the very general language of the principles and some of the specifics in the statute. Sub-principle 9.4 says the information has to be provided “within a reasonable time”. We’ll see when we flip to section 8 that that really means no later than 30 days in most cases. </p>
<P>The sub-principle also says it has to be at minimal or no cost to the individual. </p>
<P>My general advice is not to charge people for this. But there are cases where individuals will repeatedly make requests, and there is no mechanism to say “no” to frivolous or vexatious requests. Attaching a cost may make sense in those cases; for example, the first request in any twelve-month period is free.</p>
<P>I think Google had the right idea when it started providing users with the ability to download their account information. A self-serve individual access right. Since then, many large data driven companies have followed suit allowing individuals to easily access their own data for free.</p>
<P>This sub-principle also says “The requested information shall be provided or made available in a form that is generally understandable. For example, if the organization uses abbreviations or codes to record information, an explanation shall be provided.”</p>
<P>This makes sense. If a person can’t parse a JSON file or decipher technical abbreviations, the person really isn’t able to access the information. I know of some healthcare providers who will provide a nurse or a records clerk to walk through the records with a patient who asks for it.</p>
<P>Finally, you’ll note that this doesn’t go so far as to give a “data portability” right. We expect this to be added when PIPEDA is updated in the coming year or so. </p>
<h1>Disputes about accuracy</h1>
<P>PIPEDA contains an accuracy principle, which requires that “Personal information shall be as accurate, complete, and up-to-date as is necessary for the purposes for which it is to be used.”</p>
<P>The individual has the right to dispute the accuracy of any personal information a company may have, and sub-principles 9.5 and 9.6 address how this is to be dealt with. It is pretty straightforward:</p>
<blockquote>“9.5 When an individual successfully demonstrates the inaccuracy or incompleteness of personal information, the organization shall amend the information as required. Depending upon the nature of the information challenged, amendment involves the correction, deletion, or addition of information. Where appropriate, the amended information shall be transmitted to third parties having access to the information in question.”</blockquote>
<P>But what happens if the company doesn’t agree that the information is inaccurate? Sub-principle 9.6 addresses this:</p>
<blockquote>9.6 When a challenge is not resolved to the satisfaction of the individual, the substance of the unresolved challenge shall be recorded by the organization. When appropriate, the existence of the unresolved challenge shall be transmitted to third parties having access to the information in question.</blockquote>
<h1>How to make a request</h1>
<P>So those are the relevant provisions in the Schedule from the CSA Model Code. Let’s now turn to some of the specifics in the body of the statute itself. </p>
<P>Subsection (1) of Section 8 of PIPEDA says that these requests have to be in writing. This can, of course, be electronic. Note that the wording says “must”. This implies that a request that is not in writing doesn’t trigger the formalities of the Act, but can still be responded to. </p>
<h1>Duty to assist</h1>
<P>Subsection (2) of Section 8 places an obligation on the organization to assist the individual to make a request if they say they need help.</p>
<P>This makes sense. </p>
<h1>Timing</h1>
<P>I mentioned earlier that the general language about timing in the principles is firmed up in the body of the statute. Specifically, it says “An organization shall respond to a request with due diligence and in any case not later than thirty days after receipt of the request.”</p>
<h1>Extension of time limit</h1>
<P>This isn’t absolute, however. In some cases, the organization can extend the time but has to let the individual know about the extension, the reason for it and of their right to complain to the Privacy Commissioner. </p>
<P>The first circumstance is if “meeting the time limit would unreasonably interfere with the activities of the organization”. </p>
<P>This would be the case if the request is complex or would require a lot of staff resources, pulling people away from their usual tasks to the point that it would “unreasonably interfere with the activities of the organization.” What “unreasonably interfere” means is unclear. In this case, the timeline can be extended for a second thirty days. </p>
<P>The second circumstance is if the organization needs more time to carry out consultations necessary to respond to the request. For example, some of the information may have been generated in litigation or in contemplation of litigation, and the organization needs to determine if the privilege exception applies and to decide whether to waive it. In this case as well, the timeline can be extended for a second thirty days. </p>
<P>The third scenario is more open ended and allows time to convert the personal information into an alternative format. This may be to accommodate a disability. </p>
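Purely as an illustration of how the thirty-day rule and the extensions described above stack up, here is a small Python sketch. This is not legal advice; the function name and the assumption that the statute means calendar days are mine.

```python
from datetime import date, timedelta

def response_deadline(received: date, extended: bool = False) -> date:
    """Illustrative only: latest date to respond to an access request
    received on `received`, per the general 30-day rule, with an
    optional further 30-day extension (assumes calendar days)."""
    days = 30 + (30 if extended else 0)
    return received + timedelta(days=days)

# A request received March 1, 2022, with no extension:
print(response_deadline(date(2022, 3, 1)))        # 2022-03-31
# The same request with a 30-day extension for consultations:
print(response_deadline(date(2022, 3, 1), True))  # 2022-04-30
```

The point of the sketch is simply that the extensions add a defined, bounded period to the clock; they do not make the deadline open-ended.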
<h1>Deemed refusal</h1>
<P>Subsection (5) of Section 8 says that if the organization fails to respond to an access request within the timelines imposed by the Act, that is a deemed refusal and the individual thus has the right to complain to the Privacy Commissioner. </p>
<h1>Costs for responding</h1>
<P>You’ll recall that the principles say that access requests have to be “at minimal or no cost to the individual.”</p>
<P>Subsection (6) of Section 8 says that you can only charge the individual if they are advised of the approximate cost and the individual then tells the organization that the request is not being withdrawn. </p>
<P>Notably, there is no other guidance on costs or whether the cost has to be reasonable. That’s likely implied. </p>
<h1>Reasons for refusals</h1>
<P>If the organization refuses an individual’s request – and I’ll get into the exceptions that can justify a refusal shortly – this refusal has to be in writing. It has to tell them the reasons for the refusal and to tell them they have the right to complain to the Privacy Commissioner. </p>
<P>It also says that the organization essentially must preserve and retain the information at issue for as long as is necessary to allow the individual to exhaust any recourse that they may have.</p>
<P>That makes sense. If it was an unjustified refusal, and the end result is a recommendation from the Commissioner or an order from the court to hand it over, that would be thwarted if the information were deleted in the meantime. </p>
<h1>Mandatory refusals</h1>
<P>The Act contains a number of circumstances where access either can be refused or where it must be refused. </p>
<P>In subsection (1) of section 9, it says that you have to refuse to provide access if doing so would disclose personal information of a third party. If that personal information can be severed from the disclosure, then you must do the severing and provide the balance of the information. If the third party consents, then access can be granted. </p>
<P>Interestingly, subsection (2) allows giving access even if it would disclose third party personal information if the “individual needs the information because an individual’s life, health or security is threatened.” </p>
<P>Notably, it is not limited to cases where the applicant’s own life, health or security is threatened. </p>
<P>That is a real outlier of a scenario and if you encounter that, get immediate advice from an experienced privacy lawyer. </p>
<P>A second scenario where access must be refused is if the personal information that is the subject of the access request has previously been requested by law enforcement, national security or other government agencies. If this is the case: get immediate advice from an experienced privacy lawyer. </p>
<P>The Act sets out a whole routine of consulting with the government agency, seeking their input or direction. If they say don’t disclose it, you can’t disclose it. And you probably can’t tell the individual why and you also have to give notice to the Privacy Commissioner. </p>
<P>The legislators have created a real minefield for organizations if this comes up, so proceed with caution and with good advice.</p>
<h1>Discretionary refusals</h1>
<P>Subsection (3) of Section 9 sets out a number of circumstances where an organization can choose to refuse access. It doesn’t have to provide it, but it can. </p>
<P>The first is if the information is protected by legal advice or litigation privilege. This comes up a lot because individuals often use the access right under PIPEDA as a pre-litigation discovery tool. If there’s any doubt about whether information fits in this category, seek advice. And of course be aware that voluntarily providing access to privileged information would amount to a waiver of privilege. </p>
<P>The second is if providing access would reveal confidential commercial information, but if that information can be severed, it has to be and the balance of the information must be provided. </p>
<P>The third is if disclosing the information could reasonably be expected to threaten the life or security of another individual. As with confidential commercial information, if that information can be severed, it has to be and the balance of the information must be provided. </p>
<P>The fourth is if the information was collected under paragraph 7(1)(b), which is if it was collected without the knowledge or consent of the individual in connection with an investigation related to a breach of an agreement or a contravention of the laws of Canada or a province. If you refuse on this basis, you have to notify the Privacy Commissioner and include in the notice to the individual whatever information that the Commissioner may specify. </p>
<P>The fifth is if the information was generated in the course of a formal dispute resolution process. This would be in addition to litigation privilege, referred to in paragraph (a). </p>
<P>The sixth scenario where access can be refused is if the information relates to an investigation under the Public Servants Disclosure Protection Act. This rarely arises. </p>
<h1>Conclusion</h1>
<P>At the end of the day, Canadians are generally not frequent users of the individual access right that they have in the Personal Information Protection and Electronic Documents Act. </p>
<P>But businesses need to understand that this right exists and should have processes and procedures to manage it. Hopefully this has provided information on the general rules that apply to this, and the exceptions to the general right of access. </p>
<P>Thank you very much for tuning in. If you have any comments on this video or any suggestions for topics you’d like to see covered in the future, please leave them in the comments below. </p>
<P>If you find this sort of content to be interesting or informative, please subscribe. If you also click the bell, you’ll be notified of new videos as they are posted. </p>
<div class="blogger-post-footer"><script type="text/javascript"><!--
google_ad_client = "pub-2534906746401214";
//728x15, created 12/29/07
google_ad_slot = "1518476471";
google_ad_width = 728;
google_ad_height = 15;
//--></script>
<script type="text/javascript"
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script></div>privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-87687966410530188042022-02-28T08:00:00.002-04:002022-03-07T08:33:48.011-04:00Video: What are privacy policies for in Canada?<p>New on my<a href="https://www.youtube.com/channel/UCK1JnZpphwTJZfsjW4Ng3mA/videos" target="_blank"> YouTube Channel</a>. <p>
<iframe width="480" height="360" src="https://www.youtube.com/embed/4-6MdjNfmgE" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<p>Today I am going to be talking about what privacy policies are really for under Canadian privacy laws. </p>
<p>They are everywhere – on every website – and seldom read. But their purpose in Canada is a little misunderstood. </p>
<p>I am going to limit this discussion to Canada’s current federal private sector privacy law, called the Personal Information Protection and Electronic Documents Act or PIPEDA. But most of my comments would be applicable for the “substantially similar” laws in British Columbia and Alberta. </p>
<p>I think most people who follow this sort of stuff know that Canadian private sector privacy law is based on consent – knowledgeable informed consent. There’s often an assumption that the “knowledgeable” and “informed” parts come from people reading privacy policies. </p>
<p>That’s not the way it usually works, however. I think we all know that people seldom read privacy policies. At least based on my own informal polling of my students, fewer people are actually reading privacy policies than ever before. </p>
<p>Let’s look at what the Act actually says about consent. To see what informed consent requires, you have to look at principles 2 and 3 (which are taken from the Canadian Standards Association Model Code for the Protection of Personal Information).</p>
<h1>Getting Consent</h1>
<p>Principle 2 says:</p>
<blockquote>“The purposes for which personal information is collected shall be identified by the organization at or before the time the information is collected.”</blockquote>
<p>It then goes on to say:</p>
<blockquote>“The identified purposes should be specified at or before the time of collection … to the individual from whom the personal information is collected. Depending upon the way in which the information is collected, this can be done orally or in writing. An application form, for example, may give notice of the purposes.”</blockquote>
<p>It does not say that it should be simply set out in a privacy policy.</p>
<h1>Principle 3 – Consent</h1>
<p>Principle 3 is about consent. It says simply </p>
<blockquote>“The knowledge and consent of the individual are required for the collection, use, or disclosure of personal information, except where inappropriate.”</blockquote>
<p>We can ignore the “except where inappropriate” part because all the exceptions are enumerated in section 7 of the Act.</p>
<p>Principle 3 then goes on and says </p>
<blockquote><p>“The principle requires ‘knowledge and consent’.</p>
<p>Organizations shall make a reasonable effort to ensure that the individual is advised of the purposes for which the information will be used.</p>
<p>To make the consent meaningful, the purposes must be stated in such a manner that the individual can reasonably understand how the information will be used or disclosed.”</p></blockquote>
<p>Again, it does not say just throw it in the privacy policy. </p>
<p>So you can really only be confident that you have adequate consent if you are confident that the individual has actually been apprised of the purposes for the collection, use or disclosure of their personal information. </p>
<p>In most cases, you can’t be confident that any particular visitor to your website has scrolled to the bottom and has even seen the link to a privacy policy, let alone clicked on one. </p>
<p>In some cases, however, you could use the privacy policy to “identify purposes”. That would be where you require a new visitor, or someone who is creating a new account, to read and acknowledge the privacy policy. In that case, you have made the effort to bring all the purposes to the user’s attention.</p>
<p>In other cases, you might give users clear notice that your privacy policy has been updated, and either make them review it or at least tell them to do so.</p>
<p>So if a privacy policy in Canada isn’t for getting consent, what is it for?</p>
<h1>Principle 8 – Openness</h1>
<p>To find out, we have to flip forward to the 8th principle, entitled “Openness”. </p>
<p>Spoiler alert – privacy policies in Canada are about being open and transparent. They should also be where you go for answers to any privacy-related questions. </p>
<p>Let’s read Principle 8, starting with the main principle:</p>
<blockquote>“An organization shall make readily available to individuals specific information about its policies and practices relating to the management of personal information.”</blockquote>
<p>It doesn’t come right out and say “thou shalt have a privacy policy”, but it essentially means that. </p>
<p>Subprinciple 8.1 says: </p>
<blockquote>“Organizations shall be open about their policies and practices with respect to the management of personal information. Individuals shall be able to acquire information about an organization’s policies and practices without unreasonable effort. This information shall be made available in a form that is generally understandable.”</blockquote>
<p>Be open about what you do with personal information. Make it really easy to find and make it easy to understand. </p>
<p>There’s then a list of all the additional things that an organization must have in a privacy policy:</p>
<p>The information made available shall include</p>
<blockquote>(a) the name or title, and the address, of the person who is accountable for the organization’s policies and practices and to whom complaints or inquiries can be forwarded;</blockquote>
<p>This essentially means the contact information for the organization’s privacy officer. It doesn’t have to name them, but there has to be a way to reach that person if there are any complaints or any questions.</p>
<blockquote>(b) the means of gaining access to personal information held by the organization;</blockquote>
<p>In Canada, individuals have a right of access to their personal information, subject to some limitations. This means you have to let individuals know about this right and how to exercise it. </p>
<p>I’ll likely do a full video soon on data subject access rights in Canada. </p>
<blockquote>(c) a description of the type of personal information held by the organization, including a general account of its use;</blockquote>
<p>You have to say what information you collect and how you use it. </p>
<blockquote>(d) a copy of any brochures or other information that explain the organization’s policies, standards, or codes; and</blockquote>
<p>This essentially says you have to have a privacy policy to communicate all this information. </p>
<blockquote>(e) what personal information is made available to related organizations (e.g., subsidiaries).</blockquote>
<p>If you share information between related companies, you should call this out here. </p>
<p>Also, the Privacy Commissioner of Canada says that the privacy policy should include information on whether personal information is stored outside of Canada. </p>
<h1>Who reads privacy policies?</h1>
<p>In my experience, there are only three categories of readers.</p>
<p>First, regulators, who want to make sure you have a mature privacy program.</p>
<p>Second, people with questions about the handling of their personal information.</p>
<p>Third, people with concerns or complaints about the handling of their personal information. </p>
<p>Privacy policies should be written with these audiences in mind.</p>
<p>So at the end of the day, what are privacy policies for? </p>
<p>At the very least, they are so you can say you’ve complied with Principle 8.</p>
<p>But what else? It should serve as a reference for anyone who has any questions or concerns about how an organization handles personal information. </p>
<p>Someone reading it should be able to get a handle on what information the organization collects, understand how it is used and know who to contact with any questions or concerns. </p>
privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0tag:blogger.com,1999:blog-6273930.post-11176279343986255702022-02-22T23:47:00.003-04:002022-03-07T08:34:01.409-04:00Video: Cross-border data flows for Canada<p>New on my <a href="https://www.youtube.com/channel/UCK1JnZpphwTJZfsjW4Ng3mA/videos" target="_blank">YouTube Channel</a>.</p>
<iframe width="480" height="360" src="https://www.youtube.com/embed/0_dL3pyTxi8" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<P>In today's video, I am going to talk about the mosaic of privacy laws that we have in Canada and what they have to say about cross-border data transfers.</P>
<P>First, I will talk about public sector privacy laws with two particular examples coming from British Columbia and Nova Scotia.</P>
<P>Then I will talk about Canada’s private sector privacy laws, in particular PIPEDA and the substantially similar laws in Alberta and British Columbia. I will also briefly discuss the new Quebec privacy statute.</P>
<P>Finally, I will touch on various provincial health privacy laws that also have provisions relating to cross-border data flows.</P>
<H1>Canadian privacy laws</H1>
<P>Canada is a federal country and jurisdiction as it relates to privacy is divided between the provinces and the federal government.</P>
<P>We also have three general varieties of privacy laws:</P>
<P>Those that regulate the collection, use and disclosure of personal information by the public sector – which includes governments, government agencies and other organizations like universities and school boards.</P>
<P>We have a separate category of privacy laws that regulate the private, non-government sector.</P>
<P>Because healthcare in Canada is a mix of public and private, a number of provinces have developed health privacy laws to ensure uniform treatment of personal health information regardless of whether it’s at a doctor’s office or in a hospital.</P>
<H1>Public sector privacy laws</H1>
<P>One area in Canada that does not have any gaps in privacy regulation is the public sector. Each federal, provincial and territorial jurisdiction has a public sector privacy law that regulates the collection, use and disclosure of personal information by government and government agencies.</P>
<P>One thing that they all have in common is an obligation to protect and safeguard all personal information against a range of risks, including unauthorized disclosure. Very few of them directly address cross border data flows.</P>
<H1>Privacy Act</H1>
<P>In the federal jurisdiction, we have the Privacy Act, which regulates federal government institutions.</P>
<P>The Privacy Act does not address cross-border transfers or disclosures of personal information.</P>
<P>Instead, the federal Treasury Board has created guidelines regarding outsourcing that affects personal information.</P>
<P>These guidelines do not prohibit the storage of personal information outside of Canada, but instead require an assessment to determine whether, in the circumstances, it is appropriate to use a particular service that may result in personal information being stored outside of Canada or accessed from outside of Canada.</P>
<H1>FIPPA (British Columbia) </H1>
<P>In 2004, the British Columbia Freedom of Information and Protection of Privacy Act was amended to essentially prohibit the province’s government from allowing personal information to be stored outside of Canada or accessed from outside of Canada.</P>
<P>This was because of a large-scale union campaign that latched onto privacy and fear of the USA PATRIOT Act to oppose government outsourcing of IT services.</P>
<P>These prohibitions were finally removed in 2021, likely driven by the need of governments, universities and school boards to use more modern cloud technologies to support work from home during the pandemic.</P>
<P>The replacement provisions contemplate the government passing regulations about cross-border data transfers, but we have not seen those yet.</P>
<H1>PIIDPA (Nova Scotia) </H1>
<P>In 2006, Nova Scotia followed British Columbia in strictly limiting cross-border data flows when it passed the Personal Information International Disclosure Protection Act, also known as “PIIDPA”.</P>
<P>PIIDPA contains a general prohibition against storage of personal information outside of Canada, or access from outside of Canada, for public bodies in Nova Scotia. This includes public bodies in the health sector.</P>
<P>PIIDPA is not as draconian as the British Columbia law because it does permit the “head of the public body” to authorize the storage or access outside of Canada if it is for the public body’s necessary operations.</P>
<P>The public body also has to make a report of the decision to the minister of justice, which is then made public.</P>
<P>PIIDPA also imposes specific obligations on all service providers of public bodies.</P>
<H1>Foreign demands for disclosure</H1>
<P>The most significant – but maybe less known – obligation imposed on service providers relates to “foreign demands for disclosure”. These are warrants, subpoenas and court orders by a foreign authority for records, as long as there is a penalty for non-compliance.</P>
<P>It is unlawful for a service provider to provide the data, and the public body or its service provider must give written notice of the demand to the Nova Scotia Minister of Justice.</P>
<P>Then what? I don’t know. Presumably there would be some government-to-government communications.</P>
<H1>Foreign demands under other laws</H1>
<P>Every privacy law in Canada permits disclosures without consent where the disclosure is required by law. Some include examples like warrants, subpoenas, litigation document discovery and the like.</P>
<P>None of them specify “where required by CANADIAN law”, but that is a reasonable presumption.</P>
<P>These laws, other than PIIDPA, don't make complying with a foreign demand an offence, but it would still not be permitted. </P>
<P>But at the same time, the Office of the Privacy Commissioner of Canada has been clear that if information is stored outside of Canada, it becomes subject to the laws of the place where it is stored. That’s a risk that needs to be taken into account in any contracting decision.</P>
<H1>Private sector privacy laws</H1>
<P>For most of the private sector in Canada, there are no rules that prohibit cross-border data transfers but there are rules that come into play. </P>
<P>Each private sector privacy law requires that the original “controller” makes sure that there are adequate safeguards to protect personal information. </P>
<P>The original controller has to use contractual terms to make sure that any contractors implement those safeguards. </P>
<P>Jurisdiction may affect whether safeguards can be adequately assured.</P>
<P>Disclosures by the organization or its contractors in response to a “foreign demand for disclosure” may be unlawful. Any organization dealing with something like this should immediately seek experienced legal advice.</P>
<H1>Alberta’s Personal Information Protection Act</H1>
<P>Alberta’s Personal Information Protection Act specifically addresses giving people notice about cross-border data transfers.</P>
<P>Specifically, the law requires policies and procedures that include the countries in which the collection, use, disclosure or storage is occurring or may occur, and the purposes for which the service provider has been authorized to collect, use or disclose personal information for or on behalf of the organization.</P>
<P>Because this information has to be made available upon request, it should be included in an organization’s public-facing privacy policy. </P>
<P>The Privacy Commissioner of Canada recommends this under PIPEDA as well.</P>
<H1>Quebec’s Bill 64</H1>
<P>In the past year, Quebec has significantly updated its private sector privacy law, including provisions that specifically address cross-border data transfers.</P>
<P>These new provisions come into effect on September 22, 2023.</P>
<P>When the Quebec provisions come into effect, they will require a process similar to a data transfer impact assessment under the European GDPR. </P>
<P>Before storing personal information outside of Quebec, the organization will need to carry out a privacy impact assessment, sometimes referred to as a PIA.</P>
<P>Then the organization will need to carry out an analysis of whether there will be “adequate” protection of the personal information when transferred outside of the province. </P>
<P>Finally, there needs to be a written agreement with the service provider that mitigates any risk identified in the PIA and ensures that personal information will be adequately protected. </P>
<H1>Health privacy laws</H1>
<P>Health privacy laws are a specific kind of privacy law in Canada, which cross over the private sector (doctors’ offices, pharmacies and physiotherapists) and the public sector (health authorities and public hospitals).</P>
<P>Most health privacy laws in Canada prohibit disclosures of personal health information outside of Canada unless there is consent from the individual. Some similarly prohibit disclosures outside of the province.</P>
<P>But most people who practice in this space, and some regulators I’ve spoken to, say that a transfer for processing is not a disclosure for the purposes of this prohibition.</P>
<H1>What’s the reality on the ground? </H1>
<P>Many people still believe that cross-border transfers are prohibited in Canada, which is likely the result of the publicity around the prohibitions added to the British Columbia public sector law years ago. </P>
<P>The only province that significantly limits cross-border transfers is Nova Scotia, for the public sector in that province. </P>
<P>We still see requests for proposals from both the public and the private sectors that require data residency in Canada. </P>
<P>When this happens in the public sector, this is likely in violation of international trade agreements.</P>
privacylawyerhttp://www.blogger.com/profile/03943567746055311435noreply@blogger.com0