TikTok announces new ways to filter out mature or “potentially problematic” videos

The short video app is changing its viewing experience so users now see fewer videos on topics that “may be fine as a single video, but potentially problematic if viewed multiple times,” according to Cormac Keenan, the company’s head of trust and safety. Keenan cited topics related to dieting, extreme fitness, and sadness as examples of such content. (TikTok rival Instagram has also previously tried to block teens from seeing certain weight-loss products.)

TikTok also said it was rolling out a new system that organizes content based on thematic maturity, much like the rating systems used in film and TV. The new safeguards will assign a “maturity score” to videos detected as potentially containing mature or complex themes.

The goal, according to Keenan, is “to help prevent content with overtly mature themes from reaching 13-17 year old audiences.”

Late last year, senators questioned executives from TikTok, YouTube and Snap about the steps their platforms were taking to protect teens online after a Facebook whistleblower renewed concerns about the impact of social media platforms on their younger users.
Additionally, a coalition of state attorneys general launched an investigation earlier this year specifically into TikTok’s impact on young Americans. In a statement at the time, TikTok said it limits its features by age, provides tools and resources for parents, and designs its policies with the well-being of young people in mind.

In Wednesday’s blog post, Keenan said the company is “focused on protecting the teen experience” and will be adding new functionality to provide more detailed content filtering options in the coming weeks.