TikTok Unveils Family Safety Mode Amid Increasing Child-Privacy Concerns

TikTok has unveiled Family Safety Mode, a new feature that allows parents and guardians of minor users to control screen time, police direct messages and restrict inappropriate content. The tool was announced by Cormac Keenan, TikTok’s head of trust and safety for EMEA, in a post on the company’s official website on Wednesday (Feb. 19).

The new feature, which is now available in the U.K. and will roll out in additional markets over the next several weeks, allows parents to link their TikTok accounts to their children’s in order to regulate their usage and interactions. Though these “Digital Wellbeing” controls were already available for individual users, Family Safety Mode is a way for parents to more closely monitor their children’s activity.

In the same post, TikTok also revealed that it has partnered with some of the app’s most popular influencers to create prompts in users’ feeds urging them to be more mindful of the time they spend on the platform. Screen-time management, which lets users cap the amount of time they spend on TikTok each day, was introduced in April 2019, but this is the first time such reminders will appear directly in users’ feeds.


The features come at a time of increased scrutiny for TikTok regarding its policies around underage users. Over the past year, the app has come under fire for alleged child-privacy violations, including from the Federal Trade Commission, which sued the company in 2019 for violating the Children’s Online Privacy Protection Act. That accusation sprang from children’s use of Musical.ly, the video-sharing app that was bought by TikTok’s parent company ByteDance in December 2017 and absorbed into TikTok the following August.

Though TikTok settled that suit last February for $5.7 million — the largest fine ever collected by the FTC in a children’s privacy case — in December a class-action complaint alleged that the settlement hadn’t gone far enough, stating, “Defendants have not made whole the millions of consumers harmed by their unlawful conduct.”

That still-pending suit, which was lodged in federal court in Illinois by the mothers of two minors, alleged that the app “surreptitiously tracked, collected, and disclosed” the personal information of minor children without the consent of their parents or guardians and sold their data to third-party advertisers. It further alleged that the app’s use of geolocation data made minors more vulnerable to attacks by child sexual predators.