Meta blocks strangers from DMs to teens on Instagram and Facebook

Meta is clamping down further on who sends messages to teen users on Facebook and Instagram.

On Thursday, the social media company announced new tools and features to limit teens’ exposure to potentially sensitive content on the two platforms. Foremost among them is a new restriction on who can send teens direct messages.

Effective immediately, users under the age of 16 (or 18 in certain unnamed countries) will no longer be able to receive messages from people they’re not friends with on either platform—even if the sender is another teen. The only exceptions will be people in their phone’s Contacts list.

That’s an escalation from a 2021 update that restricted the ability of adults to message users under the age of 18.

“We want teens to have safe, age-appropriate experiences on our apps,” Meta wrote in a blog post. “We’re taking additional steps to help protect teens from unwanted contact by turning off their ability to receive messages from anyone they don’t follow or aren’t connected to, by default. Before a teen can change certain Instagram settings they will now need approval from their parents through Instagram’s parental supervision tools.”

The changes do not appear to apply to Meta’s other holdings, including Threads and WhatsApp. Snapchat, which is not owned by Meta, says users can adjust their settings so they can be contacted only by Snapchat friends and people in their phone’s Contacts.

Earlier this month, Meta announced it would automatically place teens into its most restrictive content-control category, hiding posts about self-harm, eating disorders, and related topics, even when those posts are shared by accounts they follow. (Meta did not immediately reply to a Fast Company request seeking clarification.)

Meta also began issuing “nighttime nudges” to young users who scroll for more than 10 minutes, suggesting they close the app and go to bed. Teens are unable to turn off those prompts.

Still to come, Meta said, is a new feature “designed to help protect teens from seeing unwanted and potentially inappropriate images in their messages from people they’re already connected to, and to discourage them from sending these types of images themselves.”

The increased restrictions are part of an ongoing series of updates for young users. Late last year, Meta did away with cross-app communication, preventing Facebook and Instagram users from chatting with each other via direct messages. And last June, it unveiled parental supervision tools for Messenger and Instagram DMs that let parents see how their child uses the services, as well as any changes to their contact lists.

The change in status for the two messenger apps comes as Meta faces threats from the European Commission to regulate the Messenger service as a “core platform service” under the Digital Markets Act. That designation would force Meta to make Messenger interoperable with other messaging services. (Meta is fighting the designation, arguing that Messenger is a core part of the Facebook app rather than a stand-alone service, even though the Messenger app is separate from Facebook.)

It also follows a Wall Street Journal investigation last June that alleged pedophiles used Instagram and its messaging system to purchase and sell sexual content featuring minors.

Social media companies in general have been beefing up parental controls at a faster pace since last January, when Surgeon General Vivek Murthy said 13-year-olds were too young to join the sites and warned that the mental health impacts could be considerable. For Meta, it was more fuel for the fire: the company has long been fighting allegations that its products are used to harm teens.




