
Meta to limit kid’s access to sensitive content on its apps



Meta and its social platforms Instagram and Facebook have long been targets of public criticism over children’s safety and their exposure to harmful content. In recent years, however, the company has taken some meaningful steps to limit kids’ exposure to sensitive content by adding various settings and options for both kids and parents.

The latest addition to Meta’s safety measures aims to keep suicide and eating disorder content even further away from teenagers. According to Android Central, Meta announced a new set of privacy and safety policies on Tuesday intended to show teenagers more age-appropriate content. The new policy will apply to all users under 18 on Instagram and Facebook, though a global rollout might take a few months.

Teen Instagram and Facebook users now have more restricted access to self-harm and eating disorder content

As per the company’s blog post, content related to self-harm and eating disorders will be removed from users’ Instagram and Facebook feeds. Both platforms have already limited kids’ access to such content on Reels and Explore, and the restriction is now expanding to Feed and Stories. Meta says it will restrict this content even if it’s shared by someone the user follows.

Additionally, if an underage user posts about dealing with self-harm or an eating disorder, the company will refer them to expert organizations such as the National Alliance on Mental Illness. Meta added that the new policy is in line with expert guidance, and it will continue to hide more search results related to suicide, self-harm, and eating disorders on Instagram.

Another change concerns the recommendation settings for teens. Meta now automatically places users under 18 into the most restrictive content control setting on Instagram and Facebook. The company previously applied this policy only to newly registered users, but it is now expanding it to all underage users, regardless of when they signed up.

These settings aim to limit teenagers’ access to sensitive content in Search and Explore. As a side note, the setting is known as “Sensitive Content Control” on Instagram and “Reduce” on Facebook. Finally, Meta said it sends users notifications to inform them of the latest privacy settings.

