Sunday, September 14, 2025

Recording conversations -- using AI gadgets and otherwise -- and the law in Canada


One of the most common questions I get is about recording conversations. Can you do it? Is it legal? And maybe just as importantly … is it a good idea?


The answer is … complicated. And sometimes, even if it’s legal, it can be hostile or problematic.


A quick production note: I started a privacy law blog in 2004, and then started a YouTube channel at the end of 2021. To make this content as accessible as possible across multiple media, I’ve started a podcast that takes the audio and makes it available via Apple Podcasts, Spotify and the others. If you’d like privacy content while in the car or mowing the lawn, just look for “privacylawyer” in your favourite podcast app.


Now back to recording conversations and the law in Canada … 


I’ll try to break it down.


Before we get into the traditional scenarios, let’s start with something very new: AI wearables.


You might have heard of something called the “Humane Pin”. The Humane AI Pin was a screenless, AI-powered wearable device designed by the American startup Humane, which somehow thought it could replace smartphones. After shipping in April 2024 to overwhelmingly negative reviews, Humane was acquired by HP, which discontinued the device's service in February 2025. Famously, Marques Brownlee, an incredibly influential YouTuber and product reviewer, called it the worst product he’d reviewed. The Humane Pin flopped, but that wasn’t the end of “AI wearables.”





A more recent device is a thing called “Bee”. It’s a small wrist-worn gadget with microphones built in. The idea is kind of simple and a logical extension of a lot of what generative AI has to offer: You slap it on your wrist and it listens to what’s going on, it transcribes, and it helps you keep track of what’s said throughout your day. Think of it as a memory assistant. You can review conversations later, get reminders of “to-dos,” or even have it summarize meetings.





That sounds useful for productivity and accessibility. Imagine if English isn’t your first language, you’re hard of hearing, you have a bad memory, or you simply want a perfect record of a complex meeting.


I’ve had relatives dealing with dementia, and something like this could be helpful, assistive technology when memories are fading and failing. 


The catch is that they’re “always listening.” They’re not just catching your thoughts — they’re catching the people around you, likely without their knowledge. And that can raise privacy concerns.


Now, the law hasn’t changed because of gadgets like these. The same rules apply (which I’ll get into in greater detail): if you’re a party to the conversation, recording isn’t automatically illegal. But the scale and permanence are different. Instead of someone taking really detailed notes, now you have a verbatim transcript — stored in the cloud, maybe analyzed by AI, and potentially vulnerable to misuse or breach.


You may recall Google Glass, originally launched in 2014. It was pretty cool and likely ahead of its time. What caused privacy regulators heartburn was its integrated camera. Though it was not recording all the time, regulators really wanted it to have a red light on the front so that people nearby would at least be aware of whether it was recording. These new wearables are even less conspicuous, and people whose voices can be captured likely have no idea they’re being picked up.


Let’s dig into the law that applies to recording conversations in Canada, whether you do so on an old-timey reel-to-reel recorder, your smartphone or an AI wearable. And these rules are the same whether you’re face-to-face, on a phone call or in a Teams meeting.


If we’re talking about conversations that begin and end in Canada, the first place to look is the Criminal Code of Canada. Part VI of the Code is actually titled “Invasion of Privacy,” and it makes it illegal to intercept a private communication unless you have authorization — like a warrant — or unless one of the legitimate parties to the conversation consents.


The Criminal Code makes it a hybrid offence (meaning that it can be prosecuted either as an indictable offence or a summary offence) to “knowingly intercept a private communication”. The maximum penalty is up to five years in prison. There’s a saving provision which says the offence does not apply to “a person who has the consent to intercept, express or implied, of the originator of the private communication or of the person intended by the originator thereof to receive it”.


This is often called “one-party consent.” In simple terms, if you’re part of the conversation, you can record it. But if you’re not part of the conversation, you can’t secretly bug the room, leave a phone recording on the table, and walk away. That would be illegal eavesdropping.


You’ll note that consent can be implied. I haven’t seen any cases on this point, but I’d think having a loud conversation in a public place within earshot of others may be “implied consent” for the conversation to be “intercepted.” But I would not want to be the test case.


While you might see CCTV surveillance cameras all over the place, they should NOT be recording audio. That would likely be an illegal “interception of a private communication,” and I don’t think a posted sign will get the requisite consent. Many consumer-grade surveillance cameras that we’re now seeing all over the place also have the capability to record audio. If you’re using one of these cameras and it’s positioned where someone might be having a conversation, disable the audio collection.





So, if you’re a lawful participant in the conversation, the Criminal Code is not triggered. But if it’s someone else’s conversation, you can’t intercept it or record it. 


But that’s not the end of the story. In Canada, we also have privacy laws: PIPEDA federally, plus provincial laws in Alberta, BC, and Quebec.


Here’s the key: these laws don’t apply to purely personal or domestic activities. So if you’re recording a conversation for your own memory, or for journalistic purposes, or to make a record of something for your own personal purposes, you’re not subject to PIPEDA when you’re doing that. The same applies for the provincial privacy laws of Alberta, BC and Quebec. Those laws generally apply to businesses and “organizations”.


But if you’re recording for commercial purposes — say, recording customer service calls — then privacy law kicks in. In those cases, you generally need to tell the person and get their consent. You’ll notice most companies start their customer service lines with: “This call may be recorded for quality assurance and record keeping purposes.” That’s why. The idea is that you’re on notice that it will be recorded and if you stay on the line, your consent to the recording is implied.


(Technically, the company has to list all the purposes for the recording and I think many are not doing a full job. For example, you can’t just say it’s for “quality assurance” purposes when you’re also keeping the recordings for record keeping purposes.)


And there’s more: even if a recording doesn’t violate the Criminal Code or privacy statutes, you may still face claims under provincial privacy torts, or common law actions for unreasonable invasion of privacy. This is a bit of a stretch for a conversation that the recorder is lawfully a part of, but I can certainly see a possible claim if the conversation was clearly of a private nature and the recording is made public.


Now let’s shift to the workplace. This is where the issue gets interesting — and frankly, tricky.


I was at a labour and employment law conference not long ago, and almost everyone in the room had a story about employees secretly recording conversations. Sometimes they recorded meetings with their supervisors, sometimes with colleagues. And in every anecdote I heard, it was a case where the other party to that conversation would not have agreed to the recording and people got really upset when the recording became known.


If the employee is a lawful party to the conversation, it’s not illegal under the Criminal Code. But does that make it okay? Not really.


Secretly recording a conversation is almost always seen as a hostile act. It signals distrust, it poisons the relationship, and it creates a “gotcha” culture.


Employers are within their rights to regulate this. I’ve heard of cases where an employee steps out of a meeting, but leaves their phone in the room, recording. The employee may be wondering if their colleagues talk about them when they’re not around. Well, that’s eavesdropping and a crime. If they secretly record meetings they’re attending, it may not be criminal — but it can still be problematic, and it may be against workplace policy. Employers should have policies about this. 


Beyond ordinary workplaces, I’ve advised hospitals and health authorities about audio recording. Doctors and psychologists often feel uneasy when patients pull out a recorder. It can feel adversarial.


But sometimes recording is legitimate — even helpful. I remember when my father was diagnosed with cancer, my mother took detailed notes at every doctor’s appointment. There was so much information and all of it was overwhelming. If smartphones had been as common then as they are now, I would have suggested that she record these conversations, just to make sure she captured all the important information in such a stressful moment.


I’ve also spoken with psychologists whose patients wanted to record therapy sessions. At first, practitioners felt uneasy. But when we explored it, recording actually improved therapy in some cases: patients could revisit the conversation, reinforce insights, and strengthen the therapeutic relationship. Once this was understood, the psychologists were concerned about whether the patients would adequately protect the recordings of these very sensitive conversations. Once the client walks out the door, that’s not really on the psychologist, but they can talk to their clients about it. In this scenario, I think it’s important for everyone to be on the same page.


So it’s not always hostile. Sometimes it’s accommodation. Sometimes it’s simply practical.


There’s also a new one that’s come up a lot recently: AI-enabled recording and transcription services that are built into or added onto video calls. You’ve probably seen them in Zoom or Microsoft Teams — a little box pops up saying “Recording and transcription is on.” I’ve even seen people send their AI companions to calls they can’t attend personally.


These tools can be fantastic. They create a really good record of meetings, which can help with minutes, accountability, or accessibility — for example, if someone in the meeting is hard of hearing, or if English isn’t their first language. I’ve used automatic captions in a number of cases because they can be very helpful, and this is enabled by AI “interception.” Automatic transcription can also let people go back and confirm exactly what was said.


But they can also make people nervous. Suddenly, everything you say in a meeting is not just heard in the moment — it’s captured, stored, maybe even analyzed. That can change the vibe and how people participate.


It also creates a very detailed record that can be subject to discovery in litigation, which is its own risk.


From a legal standpoint, the rules haven’t really changed. If you’re part of the conversation, recording or transcribing isn’t illegal. In many ways, it’s not that different from someone taking very detailed and accurate notes. The real difference is scale and permanence: instead of one person’s notes, it’s a verbatim transcript that might live on a server indefinitely. It also creates a reliable record that is likely more credible in a hearing or a trial than any one person’s recollection or notes may be.


I think it’s a best practice for organizations to have a clear policy about the use of these tools. Decide when it’s appropriate, make sure everyone in the meeting knows what’s happening, and have rules around how those recordings and transcripts will be used, stored, and eventually deleted. I’m on the board of one volunteer organization, and it was decided that recording and AI transcription could be used but only to help the meeting’s secretary prepare the final minutes. Once the minutes were final, the recording and the transcript were deleted. The minutes are the official record.


And be careful about confidentiality. You may be fine with recording most of a meeting, but want to turn it off during any “in camera” period. And you’ll want to make sure that the recordings are securely stored in accordance with the company’s record-keeping policies.


Before I wrap up, I’ll mention two additional scenarios that are related to the legal system itself. First, under the rules of professional conduct for lawyers in Canada, there are requirements for a lawyer to notify a client or another legal practitioner of their intent to record a conversation. Rule 7.2-3 of the Law Society of Ontario Rules of Professional Conduct says:

“A lawyer shall not use any device to record a conversation between the lawyer and a client or another legal practitioner, even if lawful, without first informing the other person of the intention to do so.”

So this requires notice, not consent. Essentially, you can’t do it secretly. 


The second scenario related to the legal system is court hearings. As a general rule, you cannot record a court hearing without the permission of the presiding judge. I’ve been at hearings where reporters present are allowed to record, but the recordings can only be used to check the accuracy of their notes, and the recordings cannot be further disseminated or broadcast.


Monday, September 08, 2025

Privacylawyer content now available as a podcast

I'm a longtime podcast listener and I watch a lot of YouTube. For some time, I've wanted to be sure that anyone who may be interested in my original content can get it wherever they want it. (That's one reason why I generally post the text of my YouTube videos here on the blog. Some people like to read words rather than watch a talking head.)

From now on, my YouTube content will also be available as a podcast so you can just subscribe in your podcast app of choice.

The standalone page for the podcast can be found here: Privacylawyer - Canadian privacy and technology law with David Fraser.


Ontario privacy finding: Hidden biometrics in on-campus vending machines


On August 27, 2025, the Information and Privacy Commissioner of Ontario released a revised finding against the University of Waterloo. The initial report was issued in June this year and I should have done an episode on it then. The case involved what looked like a pretty ordinary thing on campus — vending machines. Except these weren’t just any vending machines. They were “intelligent vending machines,” installed by a third-party service provider, and they secretly used biometric face detection technology.


That sounds creepy, and the University was found to have violated Ontario’s public sector privacy law. But it’s not as cut and dried as it sounds, and there are some interesting takeaways from the decision.


Nobody on campus was aware that these vending machines used face detection technology until one of the machines malfunctioned and flashed an error message on its screen — basically outing itself as running “FacialRecognition.App.exe.” Understandably, students complained. It got a lot of media coverage and some buzz on Reddit.


Photo of a display showing an error message



The Information and Privacy Commissioner of Ontario investigated.


At the outset, the University of Waterloo challenged whether the Commissioner even had jurisdiction here. The University argued that this wasn’t really about Ontario’s Freedom of Information and Protection of Privacy Act — instead, they said it was governed by the federal Personal Information Protection and Electronic Documents Act or PIPEDA. Their reasoning? Selling snacks through vending machines is a commercial activity. And PIPEDA applies to the collection, use and disclosure of personal information in the course of commercial activity. And that meant the federal law applied, not the provincial law.


They also argued that if the vending machines didn’t actually capture personal information — as the manufacturer claimed — then there was nothing for the Commissioner to investigate. And finally, Waterloo tried to limit its responsibility by pointing out that it never contracted for biometric collection in the first place. In their view, if the vendor went off and deployed face detection technology, that was on the vendor: they didn’t ask for it and they shouldn’t be on the hook for it.


The Commissioner rejected all of those jurisdictional arguments. The decision emphasized that under FIPPA, Ontario institutions like universities are responsible for personal information collected by vendors operating on their behalf — even when those vendors are engaged in activities with a commercial character. The Commissioner leaned on the “double aspect” doctrine in our constitutional jurisprudence: both federal and provincial laws can apply at the same time. In other words, even if PIPEDA could cover some of the activity, that doesn’t oust FIPPA.


So the bottom line on the jurisdiction question was that the University of Waterloo couldn’t escape the Commissioner’s oversight just by pointing to federal law or saying “we didn’t know.” Once personal information was being collected on its campus by machines it authorized, the University was on the hook under FIPPA.


On the merits, the Commissioner concluded that the machines were capturing facial images, even if only for milliseconds. Not surprisingly, these facial images qualify as “personal information” under Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA).


The collection wasn’t authorized by law, wasn’t necessary for selling chips and chocolate bars, and no notice was given.


Therefore, in the IPC’s view, Waterloo had violated FIPPA.


To find Waterloo at fault, or in violation of FIPPA, the IPC asked and answered three questions.


First, the IPC asked: “Did Waterloo ‘collect’ personal information?” The Commissioner said yes. Even though the vendor claimed the system only processed images in real time, the machines captured full facial images in memory to estimate age and gender. That’s enough to count as a collection of personal information.


But really? Was it really Waterloo who “collected” personal information? Legally, yes. They had a vendor who was supplying goods and services on their behalf and the University is responsible for that. 


Then the IPC asked: “Was the collection compliant with FIPPA?” No. Section 38(2) of FIPPA says you can only collect personal information if it’s expressly authorized, needed for law enforcement, or necessary to carry out a lawful activity. Selling snacks doesn’t need biometric data. It might be “helpful” for marketing — but helpful isn’t the same as “necessary.” And also, no notice was given that personal information was being collected and why.


Finally, the IPC asked: “Did Waterloo have reasonable measures to protect personal information?” The Commissioner said they had decent contract clauses, but they fell down in procurement. They didn’t do the privacy risk assessment that could have flagged the biometric capability. That failure meant they didn’t exercise enough due diligence, and so they’re responsible.


Here’s where I think the finding is problematic. Waterloo had no knowledge of the biometric functionality. They weren’t using it, they didn’t ask for it, and their contract didn’t mention it. Even the vendor who responded to the RFP for vending machines apparently wasn’t aware of this functionality in some of the machines it supplied. An upstream supplier had embedded the capability, and at the time nobody was aware of it.


Due diligence is usually measured against what a reasonably prudent person would have done in the same circumstances. Without the benefit of hindsight, I think the University met that standard. But because they could have done better, the University is still on the hook for a privacy violation. The finding seems to hold them to a higher standard, based on what we know now.


It could have been enough to just give them a gentle slap upside the head, saying it’s 2025 and we need to assume that anything that uses electricity – and particularly if it’s a “connected device” – has the potential to collect personal information. You need to check. Even vending machines. 


Think about what this means in practice:


Does every university, hospital, or government office now need to disassemble or reverse-engineer every piece of technology it procures? Almost. 


Do they need to anticipate hidden biometric features in a vending machine?


Or test for surveillance capabilities in every piece of software?


That’s a pretty heavy burden — one that goes far beyond what most organizations reasonably do. I guess the standard for reasonable diligence has to be raised.


Yes, we want institutions to take privacy seriously. Yes, procurement processes should involve risk assessments. But here, it feels like the University is being faulted for not uncovering something that was essentially hidden. I’m not sure we can fault them for not asking at the time whether a vending machine used biometrics. We know now, but I don’t think they should be expected to have known to ask back then. 


While the vendor was not in the cross-hairs of the IPC’s investigation, vendors need to be mindful. If you build a product with biometric capabilities, you should have to disclose it — clearly and up front. If it’s an “internet of things” connected device, it should be clearly identified as such. There is probably a boilerplate term in most contracts that puts the vendor on the hook if it causes the customer to violate any applicable law.


In the end, a finding of having violated FIPPA isn’t like a criminal charge. The IPC issued two recommendations, which the university agreed to implement. First was to review their policies to make sure that future collection of personal information complies with FIPPA. Second was to implement practices to carry out necessary due diligence to identify, assess and mitigate any potential risks to personal information throughout the entire procurement process, including during the planning, tendering, vendor selection, agreement management and termination phases.


There’s a lesson here for everyone: it’s time to update your procurement and vendor documentation to ask about any connected or biometric features. Ask detailed questions about every bit of gear being installed and fully understand its capabilities. And I’d include reps and warranties in my contracts allowing for the termination of agreements if there has been any misrepresentation about the possible collection of personal information.


One thing also to note: I think this would have gone differently for the university if the vendor wasn’t the university’s service provider. As I mentioned before, the university is on the hook for all personal information collected by its service providers, whether or not it wanted the information collected in the first place. But if the university had structured the arrangement differently, it likely would have avoided that direct responsibility. For example, if the agreement had been more like a bare rental of space for the placement of vending machines on campus, the element of custody or control of the data likely would not have been there. Imagine the university enters into a lease with Starbucks to put a coffee shop in the library atrium. In that scenario, you wouldn’t really see the University as being responsible for Starbucks’ collection of personal information as part of the Starbucks Rewards loyalty program. Or maybe the privacy commissioner would take a different view? I kind of hope not.


In any event, there are more than a few lessons to learn from this finding. 


Wednesday, July 16, 2025

Bill C-2 "Strong Borders Act" - Supporting Authorized Access to Information Act (Part 15)


On June 3, the new Canadian government tabled Bill C-2 in Parliament, called “An Act respecting certain measures relating to the security of the border between Canada and the United States and respecting other related security measures” but with a short title of the “Strong Borders Act”.

Once again, following in the footsteps of past Conservative and Liberal governments, it contains a trojan horse that revives what has come to be known as “lawful access”. I’m really getting tired of these sorts of bills. (See Canadian Privacy Law Blog: Past Canadian "lawful access" attempts, both by Liberal and Conservative governments.)

In my last episode, I discussed Part 14 of the Bill, which creates new law enforcement authorities to get customer information, either without a warrant or court order, or with an order but based on a very low standard.  In this episode, I’ll go over Part 15, which creates a standalone “Supporting Authorized Access to Information Act”. The government says this is simply to make sure that electronic service providers have the capacity and capability to “share information” with “authorized persons”. 

I think it goes beyond this. It is similar to Bill C-26 from the last Parliament, in that it allows the government to dictate what technologies electronic service providers use. This time it is to create the capability for law enforcement to plug into service providers’ systems.


Throughout this discussion, I can’t help but be reminded that the US has had something similar in their laws, and the mandated intercept capabilities were used by Chinese hackers to get access to data. 


The "Salt Typhoon" hacking incident, attributed to a Chinese state-sponsored advanced persistent threat (APT) actor, came to light in late 2024 with revelations that the group had extensively compromised the computer systems of multiple major U.S. telecommunications companies. The stolen information included call and text message metadata, and in some high-profile instances, even audio recordings of phone calls belonging to government officials and political figures. 


A critical factor facilitating the Salt Typhoon incident was the very infrastructure put in place to comply with the Communications Assistance for Law Enforcement Act (CALEA). Enacted in 1994, CALEA mandates that telecommunications providers build "lawful intercept" capabilities into their networks to allow law enforcement and intelligence agencies to conduct court-authorized wiretaps. While intended for legitimate surveillance, these mandated "backdoors" created inherent vulnerabilities within the telecom networks. Salt Typhoon exploited these CALEA-mandated systems, effectively turning the tools designed for lawful access into pathways for unauthorized espionage. 


This is what’s coming to Canada … 


The Supporting Authorized Access to Information Act creates a framework in which the Government of Canada can require electronic service providers to facilitate law enforcement and intelligence services’ access to data and information. Much of its scope is left to regulations. The sweep of what entities can be in scope of the Bill is very broad, given the definition of “electronic service providers”:


electronic service provider means a person that, individually or as part of a group, provides an electronic service, including for the purpose of enabling communications, and that


(a) provides the service to persons in Canada; or

(b) carries on all or part of its business activities in Canada.‍ (fournisseur de services électroniques)


electronic service means a service, or a feature of a service, that involves the creation, recording, storage, processing, transmission, reception, emission or making available of information in electronic, digital or any other intangible form by an electronic, digital, magnetic, optical, biometric, acoustic or other technological means, or a combination of any such means.‍ (service électronique)


This is extremely broad, and would likely capture almost all communications services that provide any service to Canadians. It likely covers VPN – or virtual private network – providers as they provide a service that involves the transmission of information. This would also scope in text messages, emails, phone calls, voice over IP calls and video calls. 


The Act specifically targets “core providers”, who are “electronic service provider[s] belonging to a class of electronic service providers set out in the schedule.” In the version of the Bill tabled at first reading, the schedule is blank. I guess it’s “to be determined”, but I expect it’ll be all the major telcos and internet service providers in Canada. It may also include the significant messaging providers, like Apple, WhatsApp, Microsoft Teams and Zoom, and email providers like Microsoft, Apple and Google.


It is very, very broad in its possible scope. 

Ministerial regulations for “core providers”

The Act, in s. 5(2), empowers the government to create regulations placing obligations on core providers which relate to intercept and access capabilities and includes the installation of devices, etc. on behalf of “authorized persons”. 


(a)  the development, implementation, assessment, testing and maintenance of operational and technical capabilities, including capabilities related to extracting and organizing information that is authorized to be accessed and to providing access to such information to authorized persons;


(b)  the installation, use, operation, management, assessment, testing and maintenance of any device, equipment or other thing that may enable an authorized person to access information; and


(c)  notices to be given to the Minister or other persons, including with respect to any capability referred to in paragraph (a) and any device, equipment or other thing referred to in paragraph (b).


Importantly, a core provider is not required to comply with a regulation “if compliance with that provision would require the provider to introduce a systemic vulnerability in electronic protections (defined as ‘authentication, encryption and any other prescribed type of data protection’) related to that service or prevent the provider from rectifying such a vulnerability.” This would permit a regulated core provider to refuse to install a backdoor or compromise encryption if that would create a systemic vulnerability. 


Core providers can apply for an exemption for a specified period of time, in order to have time to come into compliance. 

Orders directed to specific electronic service providers

Per s. 7, the Minister is able to issue orders to any electronic service provider, regardless of whether they are a core provider, along the lines of regulations authorized under s. 5(2) for a specified period of time. In making the order, the Minister must consider:


(a)  the benefits of the order to the administration of justice, in particular to investigations under the Criminal Code, and to the performance of duties and functions under the Canadian Security Intelligence Service Act;

(b)  whether complying with the order would be feasible for the electronic service provider;

(c)  the costs to be incurred by the electronic service provider to ensure compliance with the order;

(d)  the potential impact of the order on the persons to whom the electronic service provider provides services; and

(e)  any other factor that the Minister considers relevant.


The Minister, in their discretion, may provide compensation to offset some of the costs incurred in paragraph (c). Similar to compliance with regulations, an electronic service provider is not required to comply with a portion of an order that would “require the provider to introduce a systemic vulnerability in electronic protections related to that service or prevent the provider from rectifying such a vulnerability.”


The Minister is required to permit affected electronic service providers to make representations prior to issuing an order under s. 7. 

Obligations to assist

The Act contains a very broad and problematic obligation on all electronic service providers to provide all reasonable assistance to a range of persons to “permit the assessment or testing of any device, equipment or other thing that may enable an authorized person to access information.” The list of persons authorized to make this demand include the Minister, CSIS employees, police officers and civilian employees of a police force. 


There is no threshold and no limitation on this power. For example, there is no requirement for approval from the Minister or any other senior person. It does not have to be reasonably necessary for any purpose related to the Act. You could have a lineup of people from every municipal police department out the door of an electronic service provider, and the provider would have to render this unlimited and unbounded assistance. 

Prohibitions on disclosure

The Act contains, at s. 15, very broad prohibitions on disclosure by electronic service providers, including disclosure of whether one is subject to an order, the contents of an order, information relied upon by the Minister in making an order, representations made by the electronic service provider or the Minister, and even the fact that representations were made. This is ridiculous. It may make sense to give the Minister the power to issue gag orders from time to time, where they are of the view that disclosure of the information would compromise law enforcement or national security. But that should be the limit. 


In this country secrecy should be the exception – and should have to be justified – not the default, particularly with respect to services we use every day and our civil liberties. This is so prone to overreach and possible abuse, and all of it takes place in the shadows.


It is very problematic that an electronic service provider is prohibited from disclosing “information related to a systemic vulnerability or potential systemic vulnerability in electronic protections employed by that electronic service provider”. This would mean that if any electronic service provider were to discover a vulnerability in their system, it would be prohibited by Canadian law from disclosing it to anyone. This may include a prohibition on disclosure to customers who may have been affected by a past or current vulnerability, or even that company’s own contractors who carry out security audits on its systems. For example, if a telco discovers a vulnerability in a router, they will tell the manufacturer of the router and various organizations that work diligently to make sure that the entire cybersecurity community can identify and fix vulnerabilities. 


If a telco finds a vulnerability in a system used by all Canadian telcos (because the government will get to dictate what systems telcos use), they can’t alert the other telcos about that vulnerability. 


Paragraph (g) is actively harmful to Canadians, and will be a huge boon for the bad guys who look for and exploit these vulnerabilities. It really, really has to go. 


The parameters of these prohibitions on disclosure can be subject to regulations made pursuant to s. 17 of the Act. 


Under s. 16, an electronic service provider is prohibited from applying for judicial review of any order or decision under the Act unless it gives the Minister fifteen days’ advance written notice, along with a copy of the notice of application. 


Under s. 17, the Government can make regulations respecting confidentiality and security requirements with which electronic service providers, and persons acting on their behalf, must comply. Specifically, it authorizes regulations:


(a)  respecting the disclosure of information referred to in section 15;

(b)  establishing rules of procedure for the protection of information referred to in section 15 in administrative or judicial proceedings;

(c)  respecting requirements related to employees of electronic service providers and other persons whose services may be engaged by electronic service providers, including with respect to their security clearance and location; and

(d)  respecting security requirements with respect to the facilities and premises of electronic service providers.


This is extremely broad, and is not limited to confidentiality and security measures that are reasonably required related to the purposes of the Act. Remember, “electronic service provider” is broad enough to include service providers completely and entirely outside of Canada. 


It potentially includes requirements for all of an ESP’s facilities and premises regardless of location, and paragraph (c) even permits regulations regarding where an ESP’s employees can be located and what security clearances they must hold. 


This is clear overreach. None of it is limited to protecting the security of the lawful intercept and information gathering capabilities dictated by the Act. 

Enforcement and administration

The Act gives the Minister authority to designate persons (or classes of persons) to administer and enforce the Act. These designated persons are given vast powers under s. 19 to enter any place (other than a dwelling) to verify compliance or to prevent non-compliance with the Act. Within such a place, they are authorized to:


(a) examine anything found in the place, including any document or electronic data;

(b) make copies of any document or electronic data that is found in the place or take extracts from the document or electronic data;

(c) remove any document found in the place for examination or copying;

(d) use or cause to be used any computer or data processing system at the place to examine or copy electronic data; and

(e) use or cause to be used any copying equipment at the place to make copies of any document.


The Act places an obligation on every owner of a place, a person in charge of the place and everyone in the place to give all assistance that is “reasonably required” by the designated person, including providing any document or electronic data “they may reasonably require”. In addition, in 19(6), a designated person can bring anyone with them to assist. 


This is not specifically limited to places in Canada, though it likely cannot be enforced outside of Canada. Again, this is completely without limits. The designated person can say “I want your entire customer database” and the ESP ostensibly has to comply. What’s more, it would be illegal for an employee there to refuse to assist with this outrageous demand.

Audit orders

Under s. 21, a designated person can order an electronic service provider to conduct an internal audit “of its practices, documents and electronic data to determine whether it is in compliance with any provision of this Act or the regulations.” A copy of the audit must be provided to the designated person, and if the audit uncovers any non-compliance, it must specify the non-compliance and measures taken or to be taken to comply with the relevant provision or order. 

Orders by designated persons 

The Act, at s. 23, gives the designated persons order-making powers. If they believe “on reasonable grounds that there is or is likely to be a contravention” of the Act or regulations, they can issue a written, mandatory order requiring an electronic service provider to:


(a)  stop doing something that is or is likely to be in contravention of that provision or cause it to be stopped; or

(b)  take any measure that is necessary to comply with the requirements of that provision or mitigate the effects of non-compliance.


These orders are subject to review by the Minister, on request of the electronic service provider. Unless otherwise ordered by the Minister, the order issued by the designated person must be complied with. 

Administrative monetary penalties and offences

The Act, at s. 27 et seq, provides for a full administrative monetary penalty (AMP) regime that is intended to “promote compliance with this Act and not to punish”, along with penal offences at s. 40 et seq. 


If a contravention results in an AMP, the penalty can be up to CAD $250,000, and if a violation continues more than one day, each day constitutes an additional violation. The due diligence defence is available, as are common law defences. 


The Act provides for liability by corporate “directors, officers or agents or mandataries who directed, authorized, assented to, acquiesced in or participated in the commission of the violation”. A notice of violation will set out the amount of the AMP, which can simply be paid – though doing so amounts to an admission of the violation. Alternatively, the alleged violator can enter into a compliance agreement with the Minister, or request a review by the Minister of the acts or omissions that constitute the alleged violation, or of the amount of the penalty. 


In a review by the Minister for a violation, the evidentiary standard is balance of probabilities and there is no prescribed appeal from the Minister’s decision. Judicial review would likely be available in the Federal Court of Canada. 


Violations can also be penal offences, which are summary conviction offences with a maximum fine of $500,000. If a violation continues more than one day, each day constitutes an additional violation. As with AMPs, due diligence is a defence and officers/directors can also be convicted if they “directed, authorized, assented to, acquiesced in or participated in the commission of the offence”. It is also an offence to obstruct or make a false or misleading statement to (a) a person authorized to assess or test any device, equipment or other thing, or (b) a designated enforcement person. 


In a nutshell, this part of Bill C-2 has enormous impacts on electronic service providers – globally – and represents a huge overreach with enormous power and discretion given to the Minister and “designated persons”. It has the potential to introduce significant vulnerabilities into the systems we use every day for our most private communications and also may completely upend the practice of information sharing that is the foundation for keeping the internet safe and secure. 


This “Supporting Authorized Access to Information Act” should be taken out of Bill C-2 so it can get the attention, discussion and scrutiny it deserves. I am really, really afraid that it’ll be jammed through Parliament under the guise of strengthening our border to appease the current US government. And we know that once governments get powers, they never surrender them.