*SAN FRANCISCO, Sept 17 —* Meta has introduced "Teen Accounts" aimed at enhancing the safety of underage users on Instagram, amid growing concerns about the app’s impact on the mental health of young users. Experts and authorities have criticized the popular photo-sharing platform for contributing to issues such as addiction, bullying, and negative body image.
Antigone Davis, Meta's vice president for safety, stated that "Teen Accounts" are a significant update intended to provide parents with peace of mind. With the new policy, users aged 13 to 17 will have private accounts by default, featuring stricter controls on who can contact them and what content they can view. Users aged 13 to 15 who wish to adopt a more public profile must obtain parental consent, and these rules apply to both existing and new users of the platform.
“This is a big change, and we need to ensure its successful implementation,” Davis remarked.
Meta faces mounting pressure, particularly in the United States, where approximately 40 states filed a complaint last October alleging that the company's platforms harm young people's mental and physical health through addiction and cyberbullying. Separately, Australia plans to set a minimum age for social network users of between 14 and 16.
Meta does not currently verify the ages of all its users, citing privacy considerations. Davis explained that while the company will request age verification when there is strong evidence a user has misstated their age, it prefers not to burden all three billion users with ID checks. She suggested that age verification could be handled more effectively by mobile operating systems such as Google's Android or Apple's iOS, which already hold significant user age data.
However, it remains uncertain if these new measures will satisfy governments and online safety advocates like Matthew Bergman, founder of the Social Media Victims Law Center. He has highlighted Instagram's addictive nature, which can lead users down harmful paths and has been linked to tragic outcomes for some children, including suicides encouraged by algorithm-recommended videos.
Bergman noted that many young girls have developed serious eating disorders through the platform, though Meta has in recent years restricted the promotion of extreme diets and other harmful content. He characterized the new measures as "baby steps," while acknowledging they move in the right direction. Ultimately, he believes that making platforms less addictive would improve safety without compromising their overall quality for users.