Facebook is putting a hold on the development of a kids' version of Instagram, geared toward children under 13, to address concerns that have been raised about the vulnerability of younger users.
"I still firmly believe that it's a good thing to build a version of Instagram that's designed to be safe for tweens, but we want to take the time to talk to parents and researchers and safety experts and get to more consensus about how to move forward," said Adam Mosseri, the head of Instagram, in an interview Monday on NBC's "Today" show. Mosseri also posted about the decision on Instagram.
The announcement follows an investigative series by The Wall Street Journal, which reported that Facebook knew Instagram use was contributing to mental health problems and anxiety among some teenage girls.
The development of Instagram for a younger audience was met with broader opposition almost immediately.
Facebook announced the development of an Instagram Kids app in March, saying at the time that it was "exploring a parent-controlled experience."
Two months later, a bipartisan group of 44 attorneys general wrote to Facebook CEO Mark Zuckerberg, urging him to abandon the project, citing the well-being of children.
They cited increased cyberbullying, possible vulnerability to online predators, and what they called Facebook's "checkered record" in protecting children on its platforms. Facebook faced similar criticism in 2017 when it launched the Messenger Kids app, touted as a way for children to chat with family members and friends approved by parents.
Josh Golin, executive director of children's digital advocacy group Fairplay, urged the company Monday to permanently pull the plug on the app. So did a group of Democratic members of Congress.
"Facebook is heeding our calls to stop plowing ahead with plans to launch a version of Instagram for kids," tweeted Massachusetts Sen. Ed Markey. "But a 'pause' is insufficient. Facebook must completely abandon this project."
The Senate had already planned a hearing Thursday with Facebook's global safety head, Antigone Davis, to address what the company knows about how Instagram affects the mental health of younger users.
Mosseri maintained Monday that the company believes it's better for children under 13 to have a specific platform for age-appropriate content, noting that other platforms, such as TikTok and YouTube, offer versions of their apps for that age group.
He said in a blog post that it's better to have a version of Instagram where parents can supervise and control their children's experience than to rely on the company's ability to verify whether kids are old enough to use the app.
Mosseri said that Instagram for kids is meant for those between the ages of 10 and 12, not younger. It would require parental permission to join, be ad-free, and include age-appropriate content and features. Parents would be able to supervise the time their children spend on the app and oversee who can message them, who can follow them and whom they can follow.
While work is being paused on Instagram Kids, the company will be expanding opt-in parental supervision tools to teen accounts of those 13 and older. More details on these tools will be disclosed in the coming months, Mosseri said.
This isn't the first time Facebook has received backlash for a product aimed at children. Child development experts urged the company to shut down its Messenger Kids app in 2018, saying it was not responding to a "need" as Facebook insisted but creating one instead.
In that case, Facebook went ahead with the app.