UPDATE: Meta Platforms announced Tuesday that Instagram is rolling out a new “PG-13” content standard that aims to block teens from viewing adult content. The change comes in direct response to years of criticism and lawsuits alleging the platform inadequately protects minors.
Under the new rating system, teens will face stricter limits on what they can see, with the goal of keeping content no more explicit than a typical PG-13 movie. Meta confirmed that posts featuring strong language, risky stunts, or marijuana paraphernalia will be either hidden from young users or excluded from their recommendations. The move is also meant to give parents greater control over their children’s Instagram experience.
This announcement follows multiple lawsuits against Meta, including a notable case brought by 33 states asserting that the company exploited its platform to attract and engage minors. “Meta has harnessed powerful and unprecedented technologies to entice, engage and ultimately ensnare youth and teens,” the complaint stated.
In a separate lawsuit, a former Apple executive claimed that Meta targeted his 12-year-old child on Facebook, intensifying concerns over child safety. New Mexico Attorney General Raúl Torrez accused Meta of creating a “breeding ground” for child predators on its platforms.
Skepticism nonetheless remains high among child advocacy groups. Ailen Arreaza, executive director of ParentsTogether, questioned the efficacy of Meta’s initiatives. “While any acknowledgment of the need for age-appropriate content filtering is a step in the right direction, we need to see more than announcements — we need transparent, independent testing and real accountability,” Arreaza stated.
Others, like Charles Rivkin, chairman of the Motion Picture Association, took issue with the comparison itself. He remarked that while efforts to shield children from inappropriate content are welcome, tying them to traditional movie ratings is misleading.
Meta plans to fully roll out the changes across the U.S. by the end of the year, but the announcement underscores the ongoing debate over how safe social media platforms are for younger audiences. As parents and advocacy groups watch closely, the effectiveness of the new measures will face heavy scrutiny.
Stay tuned for more updates on this developing story as Meta works to navigate the complex landscape of online safety and youth protection.
