TikTok child data protection is at the center of a sweeping Canadian privacy investigation that found the platform’s safeguards for minors inadequate and its collection of children’s personal data overly intrusive. Canadian privacy officials say hundreds of thousands of children are on TikTok annually despite the company’s stated age minimum of 13, and that the app has gathered sensitive information that was then used for marketing and content targeting. TikTok disputes parts of the findings but has pledged to introduce measures to “strengthen our platform for Canadians.”
This in-depth explainer unpacks what the Canadian investigation concluded, how TikTok is responding, what children’s data protections should look like in 2025, and the global regulatory context that’s rapidly reshaping TikTok’s obligations—from Europe’s headline fines to U.S. national security legislation and device bans. You’ll also find practical steps for families and schools, a regulatory cheat sheet for brands, and key sources to follow as enforcement actions evolve.

Key takeaways at a glance
- Canadian privacy authorities say TikTok’s protections for minors are inadequate and that children’s data has been collected and used for targeting.
- TikTok says it disagrees with some findings but will implement additional safeguards in Canada.
- The case aligns with a broader global trend: EU and UK regulators have fined TikTok over children’s data, while Canada, the EU, and the U.S. have restricted TikTok on official devices, and the U.S. passed legislation in 2024 that could force a divestiture.
- Parents and educators can reduce risks by using Family Pairing, reviewing privacy settings, and limiting data sharing. Brands need to reassess youth-targeted campaigns and age-gating compliance.
What Canada’s privacy investigation found
Canada’s Privacy Commissioner, Philippe Dufresne, led a coordinated investigation into TikTok’s data practices, with a focus on minors. Officials concluded that the popular short-video platform collects vast amounts of data, ranging from device identifiers and usage activity to location indicators and behavioral signals, and leverages this data to personalize content and ads. Investigators said that, in practice, TikTok’s efforts to block under-13 users, to obtain meaningful consent from minors, and to be transparent with them were not sufficient.
Key issues highlighted by Canadian officials:
- Inadequate age-gating: Despite TikTok’s policy that under-13s should not use the main app experience, hundreds of thousands of children have been active on the platform in Canada each year.
- Sensitive data collection and use: Authorities said TikTok collected sensitive information from a large number of children in Canada and used it for marketing and content targeting.
- Transparency and consent gaps: Officials criticized the clarity and accessibility of disclosures, particularly for young users, noting that data-driven targeting can pressure engagement and potentially exacerbate harms.
At a news conference, Commissioner Dufresne emphasized the potential harm of highly personalized feeds for youth: a recommendation engine built on extensive behavioral tracking can have outsized impacts on attention, mental health, and online safety. He also said TikTok had agreed to enhance measures to deter underage use and to more clearly set out how data is used.
TikTok’s response
TikTok said it welcomed the Canadian investigation, expressed commitment to privacy and transparency, and indicated that officials had accepted several platform enhancements it proposed. The company also said it disagreed with some of the findings, although it has not yet publicly itemized each point of disagreement. In similar cases elsewhere, TikTok has typically emphasized:

- Investments in age assurance and detection
- Under-18 account defaults that restrict messaging and visibility
- Family Pairing controls for parents and guardians
- Limits on targeted advertising to teens in certain jurisdictions
- Regular transparency reports and data access pathways
As new Canada-specific changes roll out, expect updates to its local privacy notice, educational prompts across the app for teen accounts, and possibly revised age verification flows or consent mechanisms tailored to Canadian law.
Why TikTok child data protection matters
Children’s personal data is intrinsically sensitive. For preteens and teens, digital profiling can shape what they see, when they see it, and how long they stay. Three overlapping risks matter most:
1. Profiling and targeting
- Personalization engines infer interests, vulnerabilities, mood, and attention patterns. When content is tuned to maximize engagement, children may be nudged toward extreme content clusters or repetitive loops that crowd out diverse information and healthy breaks.
2. Privacy and security exposure
- The collection of identifiers, location signals, and activity logs, if misused or breached, raises safety concerns. Stalkers, doxxers, or malicious actors can exploit leaks, and persistent identifiers can follow children into adulthood.
3. Developmental and mental health impacts
- Highly sticky, personalized feeds can exacerbate anxiety, body image concerns, or sleep disruption. While social media can provide community and learning opportunities, the imbalance between benefits and harms worsens when data practices are opaque and controls are weak.
Canada’s legal context: What rules apply?
- PIPEDA (federal): Canada’s Personal Information Protection and Electronic Documents Act governs how private-sector organizations collect, use, and disclose personal information in commercial activities. It requires knowledge and meaningful consent, appropriate safeguards, and limits on collection and use.
- Provincial privacy laws: Quebec, British Columbia, and Alberta have private-sector privacy laws deemed substantially similar to PIPEDA. In Quebec, Law 25 introduces enhanced obligations including privacy impact assessments and specific rules for minors.
- Children’s privacy expectations: The Office of the Privacy Commissioner of Canada (OPC) has long held that consent from young children is not meaningful without parental involvement, and that organizations must tailor explanations to be understandable to youth.
- Reform on the horizon: Canada’s proposed Digital Charter Implementation Act (Bill C-27) would modernize privacy rules and create an AI-specific framework. If enacted, it could tighten youth protections, codify privacy by design, and expand enforcement.
How TikTok collects and uses data
Most modern social platforms, TikTok included, rely on extensive telemetry and signals to tailor content. Typical categories include (a schematic sketch follows the list):
- Account and device data: Username, age or birthdate, email/phone, device model, IP address, and app version.
- Interaction and behavioral signals: Likes, comments, shares, watch time, replay rate, scroll patterns, search queries, and content topics.
- Location indicators: IP-based geolocation and, where permissions are granted, more precise signals.
- Inferred interests: On-platform behavior combined with contextual signals to build a profile of what a user is likely to engage with next.
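To make these categories concrete, here is a minimal, hypothetical sketch of how such telemetry might be modeled in code. This is an illustration only, not TikTok’s actual schema; every field name here is invented.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class InteractionEvent:
    """One hypothetical engagement signal, e.g., a like, share, or watch segment."""
    video_id: str
    event_type: str          # "like", "share", "watch", ...
    watch_time_ms: int = 0   # dense micro-engagement signal
    replayed: bool = False

@dataclass
class TelemetryRecord:
    """Illustrative grouping of the data categories listed above."""
    # Account and device data
    user_id: str
    birthdate: Optional[str]      # self-declared, the weak point of age gates
    device_model: str
    ip_address: str
    # Location indicators
    coarse_geo: Optional[str]     # IP-derived region
    precise_geo: Optional[tuple]  # only if the OS permission is granted
    # Interaction and behavioral signals
    events: list[InteractionEvent] = field(default_factory=list)
    search_queries: list[str] = field(default_factory=list)
    # Inferred interests: derived from behavior rather than collected directly
    inferred_topics: dict[str, float] = field(default_factory=dict)
```

The point of the sketch: much of the privacy risk lives in the derived fields, which exist even when no single collected datum looks sensitive on its own.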
Why this matters for TikTok child data protection:
- Volume and velocity: Short-form videos generate dense interaction signals—micro-engagements every few seconds—producing richly detailed behavioral profiles, especially powerful for prediction and targeting.
- Feedback loops: The more a child watches a category, the stronger the reinforcement becomes (see the toy example after this list). Without guardrails, personalization can drift toward extreme content or keep minors engaged longer than intended.
- Ads and monetization: Even if a platform restricts some ad categories for teens, the underlying profiling infrastructure can still shape outcomes. Policymakers focus not only on ad delivery but on the data pipelines that drive attention.
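A toy illustration of that feedback loop, assuming a simple exponential moving average over watch completion. Real recommenders are vastly more complex; the constant `alpha` and the update rule are invented for this example.

```python
def update_interest(score: float, watch_fraction: float, alpha: float = 0.3) -> float:
    """Toy reinforcement: blend the old interest score with the newest signal.

    Each near-complete watch pushes the score up, which makes the topic more
    likely to be shown again, which generates more watch signals: a loop.
    """
    return (1 - alpha) * score + alpha * watch_fraction

score = 0.2                 # mild initial interest in a topic
for _ in range(10):         # ten videos in the topic, each watched almost fully
    score = update_interest(score, watch_fraction=0.95)
print(round(score, 2))      # ~0.93: the loop has locked onto the topic
```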
Global enforcement landscape
Canada’s conclusions align with a worldwide pattern of heightened scrutiny:
- European Union fines and proceedings
- In 2023, Ireland’s Data Protection Commission fined TikTok €345 million over children’s data processing practices, citing transparency and default settings concerns. See the DPC press release: https://www.dataprotection.ie/en/news-media/press-releases
- EU institutions have also moved to limit TikTok on official devices due to cybersecurity considerations. See the European Commission’s announcement on removing TikTok from corporate devices: https://ec.europa.eu/info/news/commission-staff-instructed-remove-tiktok-corporate-devices-2023-feb-23_en
- United Kingdom enforcement
- In April 2023, the UK Information Commissioner’s Office fined TikTok £12.7 million for unlawful processing of children’s data. ICO announcement: https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2023/04/tiktok-fined-12-7m/
- United States national security and device bans
- In 2022–2023, the U.S. federal government banned TikTok on government devices. OMB guidance: https://www.whitehouse.gov/wp-content/uploads/2023/02/M-23-13-No-TikTok-on-Government-Devices-Implementation-Guidance.pdf
- In April 2024, President Biden signed legislation that could result in a U.S. ban if TikTok’s parent company ByteDance does not divest within specified timelines. Coverage from Reuters: https://www.reuters.com/world/us/biden-signs-bill-that-could-ban-tiktok-if-bytedance-fails-divest-2024-04-24/
- India’s nationwide ban
- India blocked TikTok and dozens of other apps in 2020, citing security and data concerns. Government announcement: https://pib.gov.in/PressReleasePage.aspx?PRID=1635206
- Canada’s federal device ban
- The Government of Canada restricted TikTok on government-issued mobile devices in 2023 as a precautionary cybersecurity measure. Statement: https://www.canada.ca/en/treasury-board-secretariat/news/2023/02/statement-on-social-media-application-tiktok.html
This growing body of actions underscores a convergence: regulators are treating children’s data protection, platform design, and national security as intertwined concerns.

What TikTok says it’s doing to protect minors
TikTok has introduced a range of features and policies meant to reduce risks for young users. These vary by jurisdiction, but commonly include:
- Age gating and age assurance: Users are asked to provide birthdates; TikTok says it deploys detection tools to identify potential underage usage.
- Teen defaults: For many regions, accounts for users under 16 are private by default; direct messaging restrictions apply for under-16s or under-18s depending on the country; duet/stitch features and downloading are restricted by default for younger teens.
- Family Pairing: Parents and guardians can link accounts to manage screen time, limit direct messages, and control content filters.
- Screen time tools: Default daily screen limits for teens (e.g., 60 minutes, with prompts to extend) and weekly digital well-being nudges.
- Ads and data use: Restrictions on personalized ads for users under a certain age in specific regions; prohibition of some sensitive ad categories for teens.
- Transparency: Regular transparency reports and regional privacy notices; in-app prompts that explain certain features.
TikTok has also pledged to improve clarity around data use and minors’ experiences in Canada. Still, Canadian officials say more is needed to deter underage sign-ups effectively and to make disclosures truly understandable to young audiences.
What “adequate” looks like for TikTok child data protection
Given the findings, what should platforms deliver to satisfy regulators and protect children?
- Robust age assurance, not just age gates
- Beyond self-declared birthdates, companies can use multifactor age signals, risk-based verification for sensitive features, and privacy-preserving age estimation (see the sketch after this list). Accuracy must be balanced with privacy to avoid encouraging risky data submissions (e.g., government IDs).
- Data minimization and purpose limitation
- Collect only what’s necessary; avoid tracking sensitive signals (precise geolocation, biometrics) for teens; segregate teen data from adult profiling pools; and prevent cross-service tracking by default.
- Default-strong privacy for minors
- Private-by-default profiles; no public contact discovery; restricted messaging and duet/stitch features; no downloads of minors’ videos; no targeted ads based on sensitive attributes or deep interest profiling.
- Meaningful transparency and consent
- Explain data use in age-appropriate language; use layered notices and short in-app explainers; obtain parental involvement for younger teens where appropriate and aligned with local law.
- Safety and content governance integrated with data controls
- Strengthen content classification and filtering; expand options to reset recommendations; limit repetitive exposure to potentially harmful themes; provide friction to encourage breaks.
- Independent oversight and audits
- Commit to periodic, independent privacy and safety assessments; publish redacted audit findings; empower researchers with vetted data access programs that protect users but enable accountability.
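As referenced above, here is a minimal sketch of how privacy-by-default settings and risk-based age assurance could be wired together. The thresholds, field names, and escalation rule are assumptions made for illustration, not a description of TikTok’s actual systems.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    private_profile: bool
    direct_messages: bool
    video_downloads: bool
    personalized_ads: bool
    precise_location: bool

def default_settings(age: int) -> AccountSettings:
    """Privacy by default: minors start locked down; looser settings are opt-in."""
    if age < 16:
        return AccountSettings(private_profile=True, direct_messages=False,
                               video_downloads=False, personalized_ads=False,
                               precise_location=False)
    if age < 18:
        return AccountSettings(private_profile=True, direct_messages=True,
                               video_downloads=False, personalized_ads=False,
                               precise_location=False)
    return AccountSettings(private_profile=False, direct_messages=True,
                           video_downloads=True, personalized_ads=True,
                           precise_location=False)  # precise location stays opt-in

def needs_stronger_age_check(declared_age: int, estimated_age: int) -> bool:
    """Risk-based escalation: a large gap between the self-declared age and a
    privacy-preserving estimate triggers an extra assurance step, rather than
    demanding identity documents from every user up front."""
    gap_is_large = abs(declared_age - estimated_age) >= 5
    a_minor_may_be_involved = min(declared_age, estimated_age) < 18
    return gap_is_large and a_minor_may_be_involved
```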
Implications for parents, educators, and teens
Families can reduce risk without eliminating positive online experiences. Practical steps:
- Use Family Pairing
- Link your child’s account to manage screen time, content filters, and direct messaging limits. Review settings monthly.
- Dial in privacy settings
- Ensure the account is private; turn off “Suggest your account to others”; disable video downloads; limit stitches and duets to “No one” or “Friends.”
- Tame the algorithm
- Use “Not Interested” generously; long-press to hide topics; periodically clear watch history; consider using the “Refresh your For You feed” feature if available in your region.
- Guard personal information
- Avoid posting school names, uniforms, neighborhoods, or recurring locations. Turn off location permissions in phone settings.
- Plan for balance
- Set screen time limits, enforce “no phones in bedrooms after lights-out,” and agree on tech-free routines (meals, homework windows).
- Encourage media literacy
- Discuss how recommendation engines work, how to spot manipulative content, and the importance of critical thinking.
What brands and creators should do now
For advertisers, agencies, and creators, TikTok’s changing compliance posture affects campaigns:
- Reassess age-gating and audience definitions
- Use first-party verifications where possible and align segments with local youth advertising standards. Avoid interest targeting that could implicate minors.
- Audit creative and tracking
- Ensure creatives and hashtags don’t inadvertently target teens with age-inappropriate products. Review pixels/SDKs for data minimization and honor do-not-track signals (a minimal guard is sketched after this list).
- Strengthen consent flows
- If you collect data via TikTok-linked pages, offer clear consent prompts, separate teen pathways, and a parental involvement process when needed.
- Monitor regulatory updates
- Canadian outcomes may bring supervisory follow-ups. EU DSA, UK’s Children’s Code, U.S. state privacy laws (e.g., California) and federal actions can quickly change what is allowed.
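For the tracking-audit point above, a hedged sketch of a guard that fails closed: no third-party pixel fires unless age is verified and consent is affirmative. The field names and the age threshold are illustrative, not any specific SDK’s API; adapt them to the jurisdiction and your consent platform.

```python
def should_fire_pixel(user: dict, min_ad_age: int = 18) -> bool:
    """Gate third-party tracking on age, consent, and do-not-track signals."""
    if user.get("do_not_track"):
        return False                      # honor do-not-track-style signals first
    if not user.get("consented_to_tracking", False):
        return False                      # consent must be affirmative
    age = user.get("verified_age")
    if age is None or age < min_ad_age:
        return False                      # unknown age is treated as a minor
    return True

# An unverified visitor is never tracked, even with a consent cookie set.
print(should_fire_pixel({"consented_to_tracking": True}))  # False
```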
Canadian enforcement: What comes next
Following the investigation, possible next steps include:
- Formal compliance commitments from TikTok in Canada
- Specific deadlines to deploy enhanced age assurance and clearer disclosures
- Follow-up audits or assessments by privacy officials
- Guidance for other platforms based on lessons from the TikTok case
Canada’s evolving privacy reform (including Bill C-27) could solidify requirements around children’s data in the coming years, tightening enforcement levers and aligning Canada more closely with EU-style obligations.
Balancing benefits and risks
TikTok can be a channel for creativity, community, and learning; music, language, sports, humor, and activism flourish there. But for younger users, the tight coupling of data collection, algorithmic prediction, and infinite scrolling raises specific concerns. The Canadian investigation lands squarely on that tension: how much profiling and personalization is too much when the user is a child?
For platforms, the bar is rising: not merely “don’t sell data about teens,” but design systems where minors’ privacy and safety are the defaults, and where the business model does not depend on ever-deeper surveillance of young users. For policymakers, the task is to set clear, interoperable rules that reduce risk without eliminating the genuine benefits of online participation.
TikTok child data protection: Frequently asked questions
- Is TikTok banned in Canada?
- No. Canada has restricted TikTok on government devices but not for the general public. The privacy investigation may lead to enhanced obligations, not a nationwide ban.
- Can teens still use TikTok in Canada?
- Yes, with conditions. TikTok says it is not intended for users under 13, and it provides teen-specific defaults and controls for older minors. Canadian officials are pushing for stronger measures against underage use and clearer transparency.
- What if my child lied about their age?
- Parents can use Family Pairing to manage some settings. If you believe your child is under the age threshold, you can request account removal or age correction through TikTok’s support channels.
- Does TikTok sell children’s data?
- TikTok states it does not sell personal data. Regulators focus less on “sale” and more on collection, profiling, targeting, and transfers—activities that can raise risks even without a traditional sale.
- What’s the difference between targeted and contextual ads?
- Targeted ads use profile and behavioral data about the user; contextual ads are based on the content of the video currently being viewed. For minors, many regulators favor contextual advertising or very limited targeting.
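To make that distinction concrete, here are two illustrative ad-request payloads; the field names are invented for the example and do not reflect any real ad platform’s API.

```python
# Targeted request: keyed to who the viewer is (profile and behavior).
targeted_request = {
    "placement": "feed",
    "user_profile": {"age_band": "18-24", "inferred_interests": ["sneakers", "gaming"]},
}

# Contextual request: keyed only to what is on screen right now.
contextual_request = {
    "placement": "feed",
    "content_topics": ["cooking", "baking"],  # no user profile attached
}
```

A contextual request can be discarded once the ad is served; a targeted request presupposes a durable profile of the viewer, which is exactly what regulators scrutinize for minors.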
Action plan for families in 15 minutes
- Update the app and review privacy settings.
- Turn on Family Pairing and set a daily screen limit.
- Restrict direct messages and disable downloads.
- Teach your child to use “Not Interested” and report/block features.
- Revisit settings monthly and discuss new features together.
The big picture for TikTok and regulators
The Canadian investigation signals a broader expectation: platforms must prove, not just declare, that their systems protect minors by design. In 2025, privacy and safety are converging into a single test: does the platform minimize data collection, resist manipulative patterns, make teen experiences private by default, and communicate in language kids understand?
As enforcement tightens across Canada, Europe, the U.S., and beyond, TikTok and its peers will need to continuously iterate: stronger age verification, leaner data pipelines for teens, safer recommendation defaults, and verifiable compliance controls. The companies that treat children’s data protection as a product requirement—not a legal checkbox—will be the ones that keep trust and avoid penalties.
Resources and further reading
- Office of the Privacy Commissioner of Canada (OPC) – Investigation announcement regarding TikTok (2023): https://www.priv.gc.ca/en/opc-news/news-and-announcements/2023/an_230223/
- PIPEDA overview (OPC): https://www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/
- Government of Canada – TikTok on government devices: https://www.canada.ca/en/treasury-board-secretariat/news/2023/02/statement-on-social-media-application-tiktok.html
- European Commission – Removal of TikTok from corporate devices: https://ec.europa.eu/info/news/commission-staff-instructed-remove-tiktok-corporate-devices-2023-feb-23_en
- UK ICO – TikTok fined £12.7m over children’s data: https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2023/04/tiktok-fined-12-7m/
- Ireland DPC – Press releases (includes TikTok 2023 decision): https://www.dataprotection.ie/en/news-media/press-releases
- U.S. OMB – No TikTok on Government Devices Implementation Guidance: https://www.whitehouse.gov/wp-content/uploads/2023/02/M-23-13-No-TikTok-on-Government-Devices-Implementation-Guidance.pdf
- Reuters – Biden signs bill that could ban TikTok if ByteDance fails to divest: https://www.reuters.com/world/us/biden-signs-bill-that-could-ban-tiktok-if-bytedance-fails-divest-2024-04-24/
- Government of India – Ban on certain apps including TikTok (2020): https://pib.gov.in/PressReleasePage.aspx?PRID=1635206
- TikTok Safety Center: https://www.tiktok.com/safety/en/
- TikTok Newsroom – Teen and family safety updates: https://newsroom.tiktok.com/en-us
- American Psychological Association – Health Advisory on Social Media Use in Adolescence: https://www.apa.org/topics/social-media-internet/health-advisory-adolescent-social-media-use
- Canada’s Digital Charter Implementation Act (C-27) – overview: https://www.canada.ca/en/innovation-science-economic-development/news/2022/06/digital-charter-implementation-act-2022.html
Disclosure and methodology
This article synthesizes the public statements of Canadian privacy officials, TikTok’s published safety and privacy materials, and regulatory actions and reports from trusted authorities. Where specific investigative details are not publicly itemized, this piece focuses on the themes and remedies that officials emphasized: inadequate age controls, sensitive data collection and profiling of minors, and commitments by TikTok to enhance transparency and protections in Canada.
Bottom line
TikTok child data protection is now a front-page regulatory issue in Canada and around the world. Whether through fines, audits, device restrictions, or potential structural changes, the message is consistent: children’s privacy isn’t negotiable. Platforms must adopt age-appropriate design, minimize data collection, and prioritize teen safety by default. Canadians should expect clearer disclosures and stronger age assurance on TikTok in the months ahead, backed by regulatory scrutiny to ensure promises are kept.