The Chinese social network TikTok confirmed on Wednesday to the American outlet The Verge that it will globally prevent users under 18 from using effects that alter their appearance, referring to beauty filters, a measure intended to protect their mental health.
TikTok indicated yesterday in a statement that the measure will be rolled out "in the coming weeks" and that it responds to a study it commissioned from the British NGO Internet Matters, published yesterday, which raises concerns about the impact of these filters on minors' sense of identity.
The social network, which held a safety forum in Dublin this week, said there is a "clear distinction" between filters designed to be "obvious and fun", such as those that add animal features, and those that "alter your appearance" in ways that are almost undetectable to viewers.
Restricting use
TikTok, in its statement at the forum, promised to restrict the use of "some appearance effects" for users under 18, and its head of Public Safety and Wellbeing Policies in Europe, Nikki Soo, confirmed that the measure will be applied worldwide, The Verge reports.
The platform maintains that it already "proactively" tells users when certain effects have been used in the content they see, but it will now provide "more information about how an effect can change their appearance" and work to improve understanding of the "unintended results" of these effects.
Last month, fourteen US state attorneys general sued TikTok for harming children's mental health and accused it of using an addictive content system to profit from younger users, directly targeting the use of this type of filter.
Studies back up the harm
Specifically, they alleged that "beauty" filters can lower self-esteem, especially among underage girls, and cited studies according to which 50% do not feel pretty without editing their faces and 77% say they try to change or hide some part of their body with that tool.
TikTok revealed in its statement that it has 175 million monthly users in Europe and that it deletes 6 million accounts each month created by children under 13 (its minimum age), which is why it collaborates with NGOs, legislators and regulators to reinforce its safeguards and is considering using "machine learning" technology.
It also announced that in the coming weeks it will connect users in 13 European countries with local helplines staffed by experts when they "report content in the app about suicide, self-harm and harassment", content the platform already investigates and removes if it detects that its policies have been violated.