I want to talk about a recent decision from the Ontario Divisional Court that affirms the Information and Privacy Commissioner’s very expansive view of what counts as a “use” or “loss” of personal information under Ontario’s privacy laws. Spoiler alert: it probably doesn’t mean what you think it means.
This case came out of ransomware attacks on two organizations: the Hospital for Sick Children in Toronto, known as SickKids, and the Halton Children’s Aid Society. Neither organization’s investigation found that hackers had actually looked at, copied, or stolen personal information. But both were still found by the Information and Privacy Commissioner of Ontario—the IPC—to have breached their obligations to notify individuals. And when the case went to court, the judges deferred to the regulator. Let’s look at what happened.
In 2022, both SickKids and Halton were hit by separate ransomware attacks. If you’re not familiar, ransomware is malicious software that encrypts systems and data so that they can’t be accessed unless a ransom is paid to get the decryption key.
Here, the attackers encrypted the systems at the container level—think of it like changing the lock on a filing cabinet. The files inside were untouched, unviewed, and un-exfiltrated, but temporarily unavailable.
Both SickKids and Halton promptly investigated, brought in cybersecurity experts, and concluded that there was no evidence of any data being accessed or stolen. They even notified the IPC, though they argued this was just a courtesy because the legal requirement to notify individuals wasn’t triggered. SickKids went further, posting public updates on its website and social media. But those updates didn’t include the statement, required when notification is mandatory, that affected individuals have the right to complain to the Information and Privacy Commissioner.
The IPC saw things differently. In 2024, it issued two decisions, one for SickKids and one for Halton CAS. In each, it found that the organization had experienced a privacy breach: under the statutes, the trigger is an unauthorized “use” or an unauthorized “loss” of personal information, and the IPC concluded the information had been both “used” and “lost” in an unauthorized manner. That triggered the requirement to report to the Commissioner, to notify affected individuals, and to advise them of their right to complain to the Commissioner.
Why? The IPC reasoned that encrypting the containers “handled” or “dealt with” the personal information inside them, making it inaccessible to authorized users. That, it said, was enough to count as a “use.” And because the information was unavailable for a period of time, that was also a “loss.”
It should be noted that the container-level encryption did not expose any personal information and, once remedied, did not create any real risk to the affected individuals.
For Halton, the IPC ordered notice to affected individuals—though by way of a website posting rather than direct notification. For SickKids, since it had already gone public, no remedial order was made.
Both SickKids and Halton challenged the IPC’s decisions in court. The Ontario Hospital Association even intervened to support them, arguing that this interpretation of “use” and “loss” would lead to pointless over-notification and compliance burdens.
Now, this is where what we lawyers call the “standard of review” becomes important. When a court reviews an administrative decision, like one from the IPC, it doesn’t just substitute its own view of the law. Under a framework established by the Supreme Court of Canada in a case called Vavilov, the default standard is “reasonableness.” That means the court will defer to the regulator’s decision so long as it is “reasonable”, meaning it is internally coherent, justified, and within the bounds of the law.
In other words, unless the regulator really went off the rails, the court won’t step in.
The Divisional Court (Justices Sachs, Lococo, and Kurke) dismissed both judicial review applications and Halton’s appeal.
They held that the IPC had reasonably interpreted “use” to include encryption that denied authorized users access to information, even if no one else ever looked at it. They also upheld the IPC’s finding that this was a “loss” of information, again because of the temporary unavailability.
The Applicants had argued that notification should only be required where individuals’ privacy interests were actually affected—where there’s a real risk of harm, like theft or misuse. The Court rejected that. Ontario’s Personal Health Information Protection Act and Child, Youth and Family Services Act, 2017 don’t contain a “risk of significant harm” threshold. The statutes just say notify if information is “used” or “lost.” That’s the threshold.
The Court emphasized that words like “use” don’t necessarily carry their ordinary, common-sense or dictionary meaning. Instead, they take on the meaning given by the regulator, so long as that interpretation is reasonable.
I’ll be blunt: I don’t agree with this outcome. I understand why the Court deferred to the IPC, but I don’t agree with the IPC’s interpretation of those words. Encrypting a server at the container level is not, in any meaningful sense, a “use” of personal information. In any ordinary sense of the word, it was not “used”. Nobody viewed it, nobody copied it, and nobody exfiltrated it. The information was never actually touched. Ones and zeroes are moved around hard drives every minute of every day, and we don’t think of that as data being “used”.
And calling this a “loss”? At best, it was a temporary disruption. To me, that’s not what “loss” means. Putting it on a thumb drive and misplacing it would be a “loss”. If there was a temporary power cut to their data centre and the information was not accessible for an hour, we would not think that there’s any real unauthorized “loss” of the data. There was no risk of identity theft, no misuse, no real risk of harm to the individuals involved.
Here’s where I think the problem lies: Ontario’s PHIPA and the CYFSA don’t have a risk-based threshold. They require notification if there’s a “use” or a “loss,” regardless of whether there’s any actual risk to the individual. Compare that to the federal private sector law, PIPEDA. Under PIPEDA, an organization has to notify affected individuals and report to the federal Privacy Commissioner only if there’s been a “breach of security safeguards” that creates a “real risk of significant harm”.
That’s a sensible threshold. It filters out situations like this one, where the systems were disrupted but no one’s privacy was actually at risk. In my view, the PIPEDA standard is better. It focuses on the individual’s actual risk, rather than forcing organizations to notify just because a breach happened. Without a risk filter, you end up with over-notification, unnecessary costs, and notice fatigue, which ultimately makes people take these notices less seriously.
Because Ontario’s statutes don’t include a “real risk of significant harm” threshold, regulators like the IPC are free to take a very broad approach to words like “use” and “loss.” And courts, applying the deferential reasonableness standard, are not going to interfere.
So what does this mean for organizations in Ontario? It means that a word like “use” doesn’t always mean what you think it means. Regulators may adopt broader, purposive interpretations—especially in the context of cyberattacks. And courts, applying the reasonableness standard, will generally defer to those interpretations.
It also reinforces to me that privacy law is not really a practice area that one can just dabble in. Words in the statutes don’t necessarily mean what you’d think they mean. They have meanings given to them by the regulators, and the courts will generally defer to that interpretation.
The lesson is this: don’t rely on common-sense definitions of terms like “use,” “loss,” or “disclosure.” And don’t assume that the risk-based federal standard applies provincially. Look at how regulators are interpreting these terms in practice, because that’s what will stand up in court.