(FILES) In this photo illustration, a person looks at a smartphone with an Instagram logo displayed on the screen, on 17 August 2021, in Arlington, Virginia. On 17 September 2024, Meta announced the creation of "Teen Accounts," designed to better protect underage users from the dangers associated with Instagram. The hugely popular photo-sharing app is accused by many experts and authorities of damaging the mental health of its youngest users. OLIVIER DOULIERY / AFP

Instagram tightens teen protections amid mounting pressure

Agence France-Presse, Anna Price

Meta has announced a significant update to Instagram's policies, introducing "Teen Accounts" to enhance protections for younger users amid growing concerns about the app's impact on their mental health. This move comes as experts and authorities continue to scrutinize social media platforms for their role in exacerbating issues like addiction, cyberbullying, and eating disorders among teenagers.

Under the new policy, users aged 13 to 15 will have private accounts by default. This means their profiles will be visible only to approved followers, and they will have stricter controls over who can contact them and what content they can access. Teens who wish to maintain a public profile and reduce these restrictions, perhaps to pursue influencer careers, will need parental consent. This change affects both new and existing users, reflecting Meta’s commitment to addressing concerns about youth safety on the platform.

Antigone Davis, Meta's vice president for safety, emphasized the importance of the update, stating, "'Teen Accounts' is a significant update, designed to give parents peace of mind." She acknowledged that the new features will need to be implemented robustly to be effective.

This policy change comes amid escalating global pressure on Meta. Last October, a coalition of forty U.S. states filed a lawsuit accusing Meta’s platforms, including Instagram, of contributing to the mental and physical health decline in young users. They cited issues such as addictive behavior, cyberbullying, and severe eating disorders. Meanwhile, Australia is set to introduce new regulations that will increase the minimum age for social media users to between 14 and 16.

Despite these pressures, Meta has opted against implementing widespread age verification for all users, citing concerns about user privacy and the logistical challenges of verifying three billion accounts. Instead, Davis suggested that age verification could be more effectively managed through mobile operating systems like Google’s Android or Apple’s iOS, which already have access to users' age information.

The effectiveness of these new protections remains uncertain. Critics like Matthew Bergman, founder of the Social Media Victims Law Center, argue that while the new measures are a step in the right direction, they may not go far enough. Bergman's organization, which represents parents of children whose suicides were allegedly influenced by social media content, has long criticized Instagram for fostering dangerous online environments. He advocates for broader changes that would make platforms less addictive, even at the cost of some profitability, arguing that such reforms could reduce the harm to young users without compromising the platforms' quality.

As Meta moves forward with its updated policies, the spotlight will remain on how well these changes address the serious concerns surrounding adolescent mental health and online safety.

(Source: Julie JAMMOT, Agence France-Presse)