Meta announces greater protection for minors on Instagram and Messenger

SAN FRANCISCO – On Thursday, Meta began blocking direct messages sent by strangers to minors who use Instagram and Messenger.

By default, users under 16 can only be messaged or added to group chats by people who already follow them or are connected to them, the company said in a post.

Changing this setting will require approval through the apps' "parental monitoring tools," Meta explained.

The company added that the measure is meant to prevent minors from seeing unwanted content or potentially inappropriate images in direct messages.

"We have more to share about this functionality, which will even work in encrypted chats by the end of the year," the company said.

Federal regulators and lawmakers call for tighter restrictions

Earlier this month, Meta tightened restrictions on content shown to minors on Instagram and Facebook, amid criticism that its platforms harm young people.

The restricted content includes material related to suicide and self-harm, as well as nudity.

The restrictions also cover the promotion of tobacco, alcohol, cosmetic procedures, and weight-loss programs, among other topics.

Additionally, all minors will now have the most restrictive settings on Instagram and Facebook by default, a policy that previously applied only to new users.

The changes come months after dozens of states in the United States accused Meta of harming the mental health of children and young people, and of misleading users about the safety of its platforms.

Source: AFP

Tarun Kumar
