Meta has announced new measures to strengthen safety on Instagram accounts run by adults that primarily feature children. Specifically, the changes focus on protecting children under 13 who appear in photos and videos on these accounts. While many such accounts serve positive purposes, Meta acknowledges the risk posed by bad actors who exploit them.
To combat misuse, Meta will automatically enable its strictest message settings on these accounts, aiming to block unwanted direct messages before they reach the inbox. In addition, Meta will activate the “Hidden Words” filter to catch and hide offensive comments.
Furthermore, Meta plans to notify account holders with a message at the top of their Instagram feed. This alert will inform them about the new safety settings and encourage them to review their privacy options. The company expects to roll out these updates gradually over the next few months.
Despite Meta’s claim that most adult-run child-facing Instagram accounts are harmless, the platform faced serious accusations in 2023. For example, lawsuits alleged that Meta knowingly allowed some accounts to facilitate the sexual exploitation of children. In one instance, an account posing as a 13-year-old girl attracted thousands of adult male followers.
Moreover, investigations by major news outlets revealed troubling patterns. Instagram’s recommendation system reportedly pushed users towards networks linked to pedophilia. Additionally, the platform faced criticism for enabling the sharing and sale of illegal child abuse material.
By expanding teen safety features to adult-run accounts featuring children, Meta is targeting parents and talent managers who share images of minors. The company says it will stop recommending these accounts to potentially suspicious adults and will prevent those adults from interacting with them. Furthermore, Meta will make it harder for such adults to find these accounts through search and will hide their comments.
These new protections build upon earlier updates that barred accounts featuring children from offering subscriptions or receiving gifts. Ultimately, Meta hopes these steps will reduce abuse and make the platform safer for minors.