On August 27, 2025, the Information and Privacy Commissioner of Ontario released a revised finding against the University of Waterloo. The initial report was issued in June this year and I should have done an episode on it then. The case involved what looked like a pretty ordinary thing on campus — vending machines. Except these weren’t just any vending machines. They were “intelligent vending machines,” installed by a third-party service provider, and they secretly used biometric face detection technology.
That sounds creepy, and the University was ultimately found to have violated Ontario’s public sector privacy law. But it’s not as cut and dried as it sounds, and there are some interesting takeaways from the decision.
Nobody on campus was aware that these vending machines used face detection technology until one of the machines malfunctioned and flashed an error message on its screen — basically outing itself as running “FacialRecognition.App.exe.” Understandably, students complained. The story got a lot of media coverage and some buzz on Reddit.
The Information and Privacy Commissioner of Ontario investigated.
At the outset, the University of Waterloo challenged whether the Commissioner even had jurisdiction here. The University argued that this wasn’t really about Ontario’s Freedom of Information and Protection of Privacy Act — instead, they said it was governed by the federal Personal Information Protection and Electronic Documents Act, or PIPEDA. Their reasoning? Selling snacks through vending machines is a commercial activity, and PIPEDA applies to the collection, use and disclosure of personal information in the course of commercial activity. That, they said, meant the federal law applied, not the provincial one.
They also argued that if the vending machines didn’t actually capture personal information — as the manufacturer claimed — then there was nothing for the Commissioner to investigate. And finally, Waterloo tried to limit its responsibility by pointing out that it never contracted for biometric collection in the first place. In their view, if the vendor went off and deployed face detection technology, that wasn’t on them: they didn’t ask for it, and they shouldn’t be on the hook for it.
The Commissioner rejected all of those jurisdictional arguments. The decision emphasized that under FIPPA, Ontario institutions like universities are responsible for personal information collected by vendors operating on their behalf — even when those vendors are engaged in activities with a commercial character. The Commissioner leaned on the “double aspect” doctrine in our constitutional jurisprudence: both federal and provincial laws can apply at the same time. In other words, even if PIPEDA could cover some of the activity, that doesn’t oust FIPPA.
So the bottom line on the jurisdiction question was that the University of Waterloo couldn’t escape the Commissioner’s oversight just by pointing to federal law or saying “we didn’t know.” Once personal information was being collected on its campus by machines it authorized, the University was on the hook under FIPPA.
On the merits, the Commissioner concluded that the machines were capturing facial images, even if only for milliseconds. Not surprisingly, these facial images qualify as “personal information” under Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA).
The collection wasn’t authorized by law, wasn’t necessary for selling chips and chocolate bars, and no notice was given.
Therefore, in the IPC’s view, Waterloo had violated FIPPA.
To find Waterloo at fault, or in violation of FIPPA, the IPC asked and answered three questions:
First, the IPC asked: “Did Waterloo ‘collect’ personal information?” The Commissioner said yes. Even though the vendor claimed the system only processed images in real time, the machines captured full facial images in memory to estimate age and gender. That’s enough to count as a collection of personal information.
But really? Was it really Waterloo that “collected” personal information? Legally, yes. The vendor was supplying goods and services on the University’s behalf, and the University is responsible for that.
Then the IPC asked: “Was the collection compliant with FIPPA?” No. Section 38(2) of FIPPA says you can only collect personal information if it’s expressly authorized, needed for law enforcement, or necessary to carry out a lawful activity. Selling snacks doesn’t need biometric data. It might be “helpful” for marketing — but helpful isn’t the same as “necessary.” On top of that, no notice was given that personal information was being collected, or why.
Finally, the IPC asked: “Did Waterloo have reasonable measures to protect personal information?” The Commissioner said they had decent contract clauses, but they fell down in procurement. They didn’t do the privacy risk assessment that could have flagged the biometric capability. That failure meant they didn’t exercise enough due diligence, and so they’re responsible.
Here’s where I think the finding is problematic. Waterloo had no knowledge of the biometric functionality. They weren’t using it, they didn’t ask for it, and their contract didn’t mention it. Even the vendor that responded to the RFP for vending machines apparently wasn’t aware of this functionality in some of the machines it supplied. An upstream supplier had embedded the capability, and at the time nobody in the chain was aware of it.
Due diligence is usually measured against what a reasonably prudent person would have done in the same circumstances. Without the benefit of hindsight, I think the University met that standard. But because they could have done better, the University is still on the hook for a privacy violation. The decision seems to hold them to a higher standard, based on what we know now.
It could have been enough to just give them a gentle slap upside the head, saying it’s 2025 and we need to assume that anything that uses electricity — and particularly anything that’s a “connected device” — has the potential to collect personal information. You need to check. Even vending machines.
Think about what this means in practice:
Does every university, hospital, or government office now need to disassemble or reverse-engineer every piece of technology it procures? Almost.
Do they need to anticipate hidden biometric features in a vending machine?
Or test for surveillance capabilities in every piece of software?
That’s a pretty heavy burden — one that goes far beyond what most organizations reasonably do. I guess the standard for reasonable diligence has now been raised.
Yes, we want institutions to take privacy seriously. Yes, procurement processes should involve risk assessments. But here, it feels like the University is being faulted for not uncovering something that was essentially hidden. I’m not sure we can fault them for not asking at the time whether a vending machine used biometrics. We know now, but I don’t think they should be expected to have known to ask back then.
While the vendor was not in the cross-hairs of the IPC’s investigation, vendors need to be mindful. If you build a product with biometric capabilities, you should have to disclose it — clearly and up front. If it’s an “internet of things” connected device, it should be clearly identified as such. There is probably a boilerplate term in most contracts that puts the vendor on the hook if it causes the customer to violate any applicable law.
In the end, a finding of having violated FIPPA isn’t like a criminal charge. The IPC issued two recommendations, which the university agreed to implement. First was to review their policies to make sure that future collection of personal information complies with FIPPA. Second was to implement practices to carry out necessary due diligence to identify, assess and mitigate any potential risks to personal information throughout the entire procurement process, including during the planning, tendering, vendor selection, agreement management and termination phases.
There’s a lesson here for everyone: it’s time to update all your procurement and vendor documentation to ask about any connected or biometric features. Ask detailed questions about every bit of gear being installed and make sure you fully understand its capabilities. And I’d include reps and warranties in my contracts allowing for the termination of agreements if there has been any misrepresentation about the possible collection of personal information.
One other thing to note: I think this would have gone differently for the university if the vendor hadn’t been the university’s service provider. As I mentioned before, the university is on the hook for all personal information collected by its service providers, whether or not it wanted the information collected in the first place. But if the university had structured the arrangement differently, it likely would have avoided that direct responsibility. For example, if the agreement had been more like a bare rental of space for the placement of vending machines on campus, the element of custody or control of the data likely would not have been there. Imagine the university enters into a lease with Starbucks to put a coffee shop in the library atrium. In such a scenario, you wouldn’t really see the University as being responsible for Starbucks’ collection of personal information as part of the Starbucks Rewards loyalty program. Or maybe the privacy commissioner would take a different view? I kind of hope not.
In any event, there are more than a few lessons to learn from this finding.