
TikTok is adding age restrictions to its video recommendation algorithm to protect younger users

TikTok has been getting more serious about the safety of its younger users. It is now testing how to age-restrict some content.

Viral video-sharing platform TikTok is now working on how to filter the content teen users see on the app. The safety feature will use age to hide content deemed inappropriate for younger audiences.

The company recently came under fire for the harm its popular platform could cause inexperienced users. Many industry experts found TikTok lacking in how it handles user safety, especially for teen users.

In response to the criticism, TikTok’s vice president of public policy told the US Congress that the platform was working on screening videos by age. The company’s latest announcement shows it meant what it said.

The main concern for safety advocates is TikTok’s video recommendation algorithm, famous for how accurately it targets viewers. TikTok will now introduce age-based parameters to classify videos. According to Tracy Elizabeth, TikTok’s global issue policy lead, the parameters include content maturity and thematic comfort zones.

In an interview, Elizabeth explained how the new safety feature works, “When the system is fully launched, content that we’ve identified as containing overtly mature themes could be restricted from teens. And for content that has less … mature themes, our community members are going to be able to choose the comfort zones or content maturity that they would prefer to skip or opt into.”

TikTok has not revealed how it determines a video’s maturity level, but Elizabeth said the classification is still in an innovation phase, suggesting TikTok has not finalized the method yet. However, she revealed that the technique could resemble how content for film, television, and video games is rated. “We know that there’s family-ish content, there’s teen-ish content, there’s adult-ish content. What we’d like to do is … say ‘here you go: you can pick for yourself what is that category that you feel most comfortable with.'”

Elizabeth added that parents or guardians would be able to set their preference for their children by using the platform’s Family Pairing settings.
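TikTok has not shared any implementation details, but the idea Elizabeth describes, tagging each video with a maturity tier and filtering recommendations against the viewer’s age and chosen comfort zones (set by the user or by a parent through Family Pairing), can be sketched in a few lines. Everything below, the MaturityLevel tiers, the Video and Viewer fields, and the is_recommendable check, is a hypothetical illustration of the concept, not TikTok’s actual system.

```python
from dataclasses import dataclass
from enum import IntEnum


class MaturityLevel(IntEnum):
    """Hypothetical content tiers, loosely modeled on film/TV ratings."""
    FAMILY = 0
    TEEN = 1
    ADULT = 2


@dataclass
class Video:
    title: str
    maturity: MaturityLevel
    themes: set[str]          # e.g. {"horror", "gambling"}


@dataclass
class Viewer:
    age: int
    skipped_themes: set[str]  # comfort zones opted out of by the user or via Family Pairing


def is_recommendable(video: Video, viewer: Viewer) -> bool:
    """Return True if the video may appear in this viewer's feed."""
    # Overtly mature content is withheld from users under 18.
    if viewer.age < 18 and video.maturity == MaturityLevel.ADULT:
        return False
    # Less-mature content is filtered against the viewer's chosen comfort zones.
    if video.themes & viewer.skipped_themes:
        return False
    return True


if __name__ == "__main__":
    teen = Viewer(age=15, skipped_themes={"horror"})
    feed = [
        Video("Baking tutorial", MaturityLevel.FAMILY, set()),
        Video("Slasher movie recap", MaturityLevel.TEEN, {"horror"}),
        Video("Casino night vlog", MaturityLevel.ADULT, {"gambling"}),
    ]
    for video in feed:
        print(video.title, "->", "show" if is_recommendable(video, teen) else "hide")
```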

This classification method complements TikTok’s effort to let creators indicate whether their videos are targeted at children, adults, or a general audience. The platform will presumably use these labels to control what is suggested to younger users when the feature rolls out.

TikTok is already testing age-restricted videos on a small scale, but a general rollout is still some way off. Elizabeth said many decisions about how the feature will work have not been made yet. Still, it is encouraging to see the platform taking such steps.

Apart from the new age-restriction feature, TikTok has released an updated set of community guidelines. The guidelines now single out suicide hoaxes and risky challenges when spelling out what is and is not acceptable. The policy also bans videos encouraging eating disorders, including content that promotes extreme calorie counting, short-term fasting, and other “under-recognized signs of a potential problem.”

TikTok has also prohibited “deadnaming, misgendering, or misogyny as well as content that supports or promotes conversion therapy programs.” The platform was already removing videos in these categories but is now spelling out that they are not welcome.

Written by HackerVibes
