Instagram is launching “teen accounts,” its latest effort to make the platform safer for its younger users amid rising concerns about how social media affects youth mental health.
Anyone under 18 who signs up for Instagram or already has an account will be placed into a teen account, which will be private by default and have restrictions on what kinds of content users can view, according to Meta, the parent company of the social media app.
The changes, announced Sept. 17, come as Meta faces multiple lawsuits from states and school districts claiming that the company knowingly ignored the negative impact of its platforms on young people’s mental health.
The announcement also arrives eight months after Meta said it was making it harder for teenagers to view content related to self-harm, suicide, nudity, or eating disorders on its platforms, even if it's posted by someone they follow. The company said at the time that those changes would be implemented on Instagram and Facebook within months.
Instagram teen accounts will have the strictest settings by default
Along with making teens’ accounts private by default (meaning they’d have to approve anyone who wants to follow them and see their posts), the new Instagram changes will make it so teens can only receive messages from people they follow or are already connected to, according to Meta. And “sensitive content,” such as videos of people fighting or those promoting cosmetic procedures, will be limited, Meta said.
Teens will get notifications if they’re on Instagram for more than 60 minutes each day, and a “sleep mode” will be enabled that turns off notifications and sends auto-replies to direct messages from 10 p.m. to 7 a.m., Meta said.
These settings will be turned on for all teens. Sixteen- and 17-year-olds will be able to turn them off, while kids under 16 will need their parents’ permission to do so.
Meta acknowledged that teenagers may lie about their age and said it will require them to verify their ages in more instances, such as when they try to create a new account with an adult birthdate. Meta said it uses several ways to verify age: Teens can upload their ID, record a video selfie, or ask mutual friends to verify their age. The company also said it is building technology that proactively finds accounts of teens who pretend to be adults and automatically places them into the restricted teen accounts.
These changes “can be really beneficial,” said Amelia Vance, the president of the Public Interest Privacy Center, which advocates for effective, ethical, and equitable privacy safeguards for all children and students. “I’ve heard a lot of parents and teens express that they do want a level of security or oversight.”
Meta tries to balance parental controls with teen autonomy
Anjali Verma, the president of the National Student Council and a senior at a charter school in West Chester, Pa., is on board with private accounts for teens under 16. In fact, she had a private account until she turned 17.
“It’s really important that Instagram and Meta are taking the steps to be proactive about protecting teens online,” Anjali said.
But she’s skeptical of how effective these actions will really be. There’s still work Meta could do to curb “the most addictive” parts of the app, such as endless scrolling and videos that automatically begin playing one after another, Anjali said.
Anjali said she’s also unsure about whether it’s a good idea to have parents tied to their teens’ accounts.
“I don’t think all teens necessarily have the best relationships with their parents,” she said. For some teens, social media is their outlet to express what they might not be comfortable sharing with their parents or guardians.
Balancing parental control with giving teens autonomy as they learn and grow is something social media companies and any regulations need to be careful about, Vance said.
Yvonne Johnson, the president of the National Parent Teacher Association, applauded the changes to Instagram in a statement in Meta’s press release, saying that these steps “empower parents and deliver safer, more age-appropriate experiences on the platform.”
Not all parents think Meta’s changes are enough, however.
“This is nowhere near sufficient to address the deep concerns parents and families have about social media,” said Keri Rodrigues, the president of the National Parents Union. “Teens will always find a way around these things. Kids have been doing this for a very long time, and parents do not trust that this is going to be sufficient.”
What Congress is doing about children’s online safety
It’s also not enough to trust that Meta and other social media companies will self-regulate to make sure people are safe on their platforms, Rodrigues said. There need to be laws in place that hold companies accountable for what’s happening on their platforms, she said.
Congress has been considering two bills related to children’s online safety: the Kids Online Safety Act (KOSA), which would require social media companies to take reasonable steps to prevent and mitigate harms to children, and the Children and Teens’ Online Privacy Protection Act (also known as COPPA 2.0), which would update online data privacy rules.
The Senate in late July passed the Kids Online Safety and Privacy Act, which combines KOSA and COPPA 2.0. The House has yet to vote on its own versions of KOSA and COPPA 2.0.
Even with these changes from Meta and possible social media regulations, experts say it’s still important to teach kids how to navigate the digital world and ensure that they have the skills they need to keep themselves safe.
“It’s the kids who work around [those changes] that I would be worried about,” said Beth Houf, the principal of Capital City High School in Jefferson City, Mo. “So how are we continuing to bring the education lens to this?”