(This post is largely a transcript of the YouTube and podcast episode above.)
On September 23, 2025, the Federal Privacy Commissioner and his provincial counterparts in British Columbia, Alberta and Quebec issued a joint report of findings into TikTok. This is a big one. It raises some interesting — and troubling — questions about jurisdiction, children’s privacy, reasonableness, consent, and what it actually means to protect privacy.
In my view, the Commissioners have imposed an almost impossible standard on TikTok — one that, ironically, could actually reduce privacy for users. Let’s unpack what they found, and why I think they may have gone too far.
I’ll note that the finding is more than thirty pages long, with almost two hundred paragraphs, so this post should be treated as an overview and not a deep dive into all of the minutiae.
TikTok Pte. Ltd., a Singapore-based company owned by ByteDance, operates one of the most popular social-media platforms in the world. In Canada alone, about 14 million monthly users scroll, post, and engage on TikTok.
The investigation examined whether TikTok’s collection, use, and disclosure of personal information complied with PIPEDA, Quebec’s Private Sector Act, and the provincial privacy statutes of Alberta and B.C.
A key preliminary issue was jurisdiction.
The British Columbia Personal Information Protection Act is a bit quirky. It says
Application
3 (1) Subject to this section, this Act applies to every organization.
(2) This Act does not apply to the following:
…
(c) the collection, use or disclosure of personal information, if the federal Act applies to the collection, use or disclosure of the personal information;
TikTok argued that because of this, only one of the Federal Act or the British Columbia Act could apply.
In my view, the response to this argument by the Commissioners is facile. They said:
[22] Privacy regulation is a matter of concurrent jurisdiction and an exercise of cooperative federalism, which is a core principle of modern division of powers jurisprudence that favours, where possible, the concurrent operation of statutes enacted by the federal and provincial levels of government. PIPA BC has been “designed to dovetail with federal laws” in its protection of quasi-constitutional privacy rights of British Columbians. The legislative history of the enactment of PIPEDA and PIPA BC and their interlocking structure support the interpretation that PIPEDA and PIPA BC operate together seamlessly.
[23] PIPA BC operates where PIPEDA does not, and vice versa. In cases such as the present, which involve a single organization operating across both jurisdictions with complex collection, use, and disclosure of personal information, both acts operate with an airtight seal to leave no gaps. An interpretation of s. 3(2)(c) that would deprive the OIPC BC of its authority in any circumstance the OPC also exercises authority is inconsistent with the interlocking schemes and offends the principle of cooperative federalism.
In my view, this has nothing to do with “cooperative federalism”. Here, the Commissioners are waving their hands instead of engaging in helpful legal analysis. The British Columbia legislature chose to say that if PIPEDA applies, PIPA will not. This is not about constitutional law.

The Commissioners could have articulated a much clearer and more straightforward response to this argument: TikTok collects personal information across Canada, in BC and elsewhere. PIPA applies to “the collection, use and disclosure of personal information that occurs within the Province of British Columbia” (the quoted words come from the federal regulation addressing PIPEDA’s application in British Columbia). So in this joint investigation, BC’s PIPA applies to the personal information of British Columbians, and PIPEDA applies to the personal information of individuals outside British Columbia. They could have said that, but they didn’t. Instead, they said it was about “overlapping protections” and not “silos”. I think this is incorrect. The British Columbia Act and the federal regulation clearly say: this is “the BC Commissioner’s silo”, and this is “the Federal Commissioner’s silo.”
So, the investigation moved forward jointly, setting the stage for three major questions:
Were TikTok’s purposes appropriate?
Was user consent valid and meaningful?
Did TikTok meet its transparency obligations — especially in Quebec?
The first issue asked whether TikTok was collecting and using personal information — particularly from children — for an appropriate and legitimate purpose.
TikTok’s terms forbid users under 13 (14 in Quebec), but the Commissioners found its age-assurance tools were largely ineffective. The platform relied mainly on a simple birth-date gate at signup, plus moderation for accounts flagged by other users or automated scans.
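To see why the regulators considered that gate weak, it helps to look at what a birth-date gate amounts to in practice. Here is a minimal sketch (the function, names, and structure are my own illustration, not TikTok’s actual code): the entire control reduces to one comparison against a self-reported date.

```python
from datetime import date

# Hypothetical minimum ages mirroring the terms described above.
MINIMUM_AGE = 13
MINIMUM_AGE_QUEBEC = 14

def passes_age_gate(claimed_birth_date: date, in_quebec: bool = False) -> bool:
    """Return True if the self-declared birth date meets the minimum age.

    The obvious weakness: the gate trusts whatever date the user types.
    Nothing stops a twelve-year-old from entering 1990-01-01.
    """
    today = date.today()
    age = today.year - claimed_birth_date.year - (
        (today.month, today.day) < (claimed_birth_date.month, claimed_birth_date.day)
    )
    return age >= (MINIMUM_AGE_QUEBEC if in_quebec else MINIMUM_AGE)
```

Anything stronger than this, by definition, means collecting more than a self-declared date.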
Through those measures, TikTok said that it removes around half a million underage Canadian accounts each year — but regulators concluded that many more likely go undetected.
It seems to me that terminating half a million accounts a year because it believes the users may be underage is a pretty strong sign that the company is sincere in its desire to NOT have kids on its platform.
They also noted TikTok already uses sophisticated facial- and voice-analytics tools for other purposes, like moderating live streams or estimating audience demographics, but not to keep kids off the platform. The regulators want TikTok to re-purpose these tools for age estimation.
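For illustration only, here is roughly what that re-purposing might look like. Everything below is invented (the model stub, the thresholds, the buffer) and reflects nothing about TikTok’s actual systems; the point is that any probabilistic age estimate needs a buffer zone, and the buffer is where the privacy trade-off lives.

```python
# Hypothetical sketch of using an age-estimation model as a gate.

def estimate_age_from_face(frame) -> tuple[float, float]:
    """Placeholder for a trained vision model returning (age, confidence)."""
    return 16.0, 0.9  # dummy values for illustration

UNDERAGE_CUTOFF = 13
REVIEW_BUFFER = 3  # years; near-cutoff estimates need a second look

def triage_account(frame) -> str:
    """Route an account based on a facial age estimate.

    Estimates are probabilistic, so users near the cutoff get swept
    into extra verification, which means more data collection from
    people who were never the target of the rule.
    """
    age, confidence = estimate_age_from_face(frame)
    if age < UNDERAGE_CUTOFF and confidence >= 0.8:
        return "remove"
    if age < UNDERAGE_CUTOFF + REVIEW_BUFFER:
        return "request_verification"  # e.g., ID upload or age scan
    return "allow"
```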
The Commissioners found that TikTok was collecting sensitive information from children — including behavioral data and inferred interests — without a legitimate business need. In their view, that violates the “reasonable person” standard under PIPEDA s. 5(3) and the comparable provisions in the provincial laws.
This part makes my head hurt a bit. The regulators said:
[67] In light of the above (as summarized in paragraphs 64 to 66), we determined that TikTok has no legitimate need or bona fide business interest for its collection and use of the sensitive personal information of these underage users (in the context of PIPEDA, PIPA AB and PIPA BC), nor is this collection and use in support of a legitimate issue (in the context of Quebec’s Private Sector Act). It is therefore our finding, irrespective of TikTok’s assertion that this collection and use is unintentional, that TikTok’s purposes for collection and use of personal information of underage users are inappropriate, unreasonable, and illegitimate, and that TikTok contravened subsection 5(3) of the PIPEDA, section 4 of Quebec’s Private Sector Act, sections 11 and 14 of PIPA BC and sections 11 and 16 of PIPA AB.
It’s clear that TikTok does not want children on its platform and takes active steps to keep them off. The regulators were clear that they didn’t think those measures were adequate, but I didn’t see them say that TikTok was insincere about this. Yet they found that TikTok’s purposes for collecting personal information from children were not reasonable.

But TikTok had no purpose for collecting personal information from children. If kids make it through the age gate and don’t have their accounts deleted, TikTok still does not want that data. The Commissioners essentially said: “Your collection of personal information that you do not want and do not try to get is unreasonable.” Ok. I guess that’s their view.
The second issue focused on consent — whether TikTok obtained valid and meaningful consent for tracking, profiling, targeting, and content personalization.
The Commissioners said it did not.
They found that TikTok’s privacy policy and consent flows were too complex, too long, and lacked the up-front clarity needed for meaningful understanding. In particular:
Key information about what data was being collected and how it was used wasn’t presented prominently.
Important details were buried in linked documents.
The privacy policy was not available in French until the investigation began.
And users were never clearly told how their biometric information — facial and voice analytics — was used to infer characteristics like age and gender.
Even for adults, the Commissioners said consent wasn’t meaningful because users couldn’t reasonably understand the nature and consequences of TikTok’s data practices.
And for youth 13–17, TikTok mostly relied on the same communications used for adults — no simplified, age-appropriate explanations of how data is collected, used, or shared.
Under the Commissioners’ reasoning, because the data involved is often sensitive — revealing health, sexuality, or political views — TikTok needed express consent. They found the platform failed that standard.
[81] Additionally, while users might reasonably expect TikTok to track them while on the platform, which they can use for “free”, it is our determination that they would not reasonably expect that TikTok collects the wide array of specific data elements outlined earlier in this report or the many ways in which it uses that information to deliver targeted ads and personalize the content they are shown on the platform. Many of these practices are invisible to the user. They take place in the background, via complex technological tools such as computer vision and TikTok’s own machine learning algorithms, as the user engages with the platform. Where the collection or use of personal information falls outside of the reasonable expectations of an individual or what they would reasonably provide voluntarily, then the organization generally cannot rely upon implied or deemed consent.
The Commissioners’ reasoning is generally coherent, but I’m not sure it leads directly to a requirement for express consent. Consent can be implied where the individual understands what information is being collected and how it will be used, and it makes sense to take into account whether the individual expects that collection and use. The real problem here is that information was collected and used outside the reasonable expectations of the individual. TikTok’s data practices are part of the “secret sauce” that has led to its success. Following the reasoning of the Commissioners … if TikTok had better calibrated the expectations of its users, it could have relied on implied consent.
The Quebec Commissioner took things even further. Under Quebec’s Private Sector Act, organizations must inform the person concerned before collecting personal information.
The CAI found TikTok failed to highlight key elements of its practices and was using technologies like computer vision and audio analytics to infer users’ demographics and interests without adequate disclosure.
The CAI also found that TikTok allowed features that could locate or profile users without an active opt-in action, violating Quebec’s rule that privacy settings must offer the highest level of privacy by default.
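Quebec’s default-settings rule is simple to state in engineering terms: every privacy-relevant setting ships at its most protective value, and anything less requires an affirmative user action. Here is a hypothetical sketch, with field names invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Defaults set to the most privacy-protective value, as Quebec requires.

    None of these fields are TikTok's actual settings; they illustrate
    the principle that loosening any of them must be an active opt-in.
    """
    precise_location: bool = False
    personalized_ads: bool = False
    profile_discoverable: bool = False
    off_platform_tracking: bool = False

def opt_in(settings: PrivacySettings, field: str) -> None:
    """Loosen a setting only in response to an explicit user action."""
    setattr(settings, field, True)
```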
Now here’s where I think the Commissioners overreached.
They’re effectively holding TikTok — and by extension, every global digital platform — to a near-impossible standard.
First, on age verification: to exclude all under-13 users, TikTok would need to collect more information from everyone — things like government-issued ID or facial-age scans. That’s exactly the kind of sensitive biometric data privacy regulators have previously warned against.
So in demanding “better” age assurance, the Commissioners are actually requiring more surveillance and more data collection from all users — adults and teens alike. It may be framed as “protecting the children”, but like so many age-assurance tools, it is actually privacy-invasive.
Second, on consent and transparency: privacy regulators have long said privacy policies are too long, too legalistic, and too hard to read. Yet here, they criticize TikTok for not providing enough detail — for not being even longer and more comprehensive.
So which is it? We can’t reasonably expect the average user to read a novel-length privacy policy, yet that’s what these findings effectively require.
And third, the Commissioners’ reasoning conflates complexity with opacity. TikTok’s algorithms and personalization systems are complex — that’s the nature of modern machine learning. Explaining them “in plain language” is a noble goal, but demanding a full technical manual risks burying users in noise.
In my view, this decision reflects a growing tension in privacy regulation: between idealism — the desire for perfect transparency and perfect protection — and pragmatism — the need for solutions that actually enhance user privacy without breaking the internet.
The regulators seem to be demanding a standard of perfection in a messy and complicated world. These laws can be applied reasonably and flexibly.
One final thing to note: the regulators say that information provided to support consent from young people (over the age of 13 or 14) has to be tailored to the cognitive level of those young people. That means the assessment has to be subjective, in light of the individual. But the Privacy Commissioner of Canada is arguing against Facebook in the Supreme Court of Canada that consent is entirely objective, based on the fictional “reasonable person” (who is NOT a young person). They should pick a lane.
So, where does this leave us? TikTok has agreed to implement many of the Commissioners’ recommendations — stronger age-assurance tools, better explanations, new teen-friendly materials, and improved consent flows.
But whether these measures will truly protect privacy — or simply demand more data from more users — is a question regulators and platforms alike still need to grapple with.