Facebook said on Monday that it had paused development of an “Instagram Kids” service that would be tailored for children 13 years old or younger amid questions about the app’s effect on young people’s mental health.
The announcement comes ahead of a congressional hearing this week about internal research conducted by Facebook, and reported in The Wall Street Journal, that showed the harmful mental health effects Instagram was having on teenage girls.
Facebook said it still wanted to build an Instagram product intended for children that would have a more “age appropriate experience,” but was postponing the plans in the face of the outside criticism.
“This will give us time to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the value and importance of this project for younger teens online today,” Adam Mosseri, the head of Instagram, wrote in a blog post.
Facebook has argued that young people are using Instagram anyway, despite age-requirement rules, so it would be better to develop a version more suitable for them. Facebook said the “kids” app was intended for children ages 10 to 12 and would require parental permission to join, forgo ads, and carry more age-appropriate content and features. Parents would be able to control which accounts their child followed. YouTube, which is owned by Google, has released a children’s version of its app.
But since it became public earlier this year that Facebook was working on the app, the company has faced criticism from policymakers, regulators, child safety groups and consumer rights groups. They have argued that the app would hook children at a younger age rather than protect them from problems with the service, including predatory grooming, bullying and body shaming.
Opposition to Facebook’s plans gained momentum this month when The Journal published a series of articles based on leaked internal documents that showed Facebook knew about many of the harms it was causing. Facebook’s internal research showed that Instagram, in particular, had a negative mental health effect on young people, especially young girls, even while company executives publicly tried to minimize the app’s downsides.
“We need to keep going and ensure this pause becomes permanent,” said Josh Golin, executive director of Fairplay, a Boston-based group that was part of an international coalition of children’s and consumer groups opposed to the new app. “This is a watershed moment for the growing tech accountability movement.”
American policymakers should pass tougher laws to restrict how tech platforms target children, Mr. Golin said. Britain adopted an “Age Appropriate Design Code” last year that requires added privacy protections for digital services used by people under the age of 18.
Mr. Golin called on Facebook to conduct a major public education campaign to tell parents to get their children under the age of 13 off Instagram.
On Thursday, Facebook’s global head of safety, Antigone Davis, is scheduled to testify at a Senate Commerce Committee hearing titled “Protecting Kids Online: Facebook, Instagram, and Mental Health Harms.” Ms. Davis will be questioned about Facebook’s research into the impact of its apps on young people and its work to address risks to young users.
Mr. Mosseri cast the internal research in a positive light, saying it was used to help guide product decisions, including new features that allow people to pause their accounts or block certain words that could be used for bullying or harassment.
Facebook said that although plans for a children’s Instagram were postponed, it intended to introduce new parental control features in the coming months.