Forty-four attorneys general, representing all but six of the nation's states, are urging Facebook (NASDAQ: FB) to drop its plans to launch a kid-branded version of Instagram, arguing that such an app could harm kids' mental health and expose them to danger from child predators.
Facebook will target the new app at children under 13, for whom Instagram is currently off-limits. The company claims that it's consulting with child development experts and safety advocates in building out the app and that the new service will be free of advertisements.
Facebook is no stranger to developing COPPA-compliant versions of its most popular services. In 2017, the company launched a kids' version of its Messenger app, complete with parental controls that allow parents to approve whom their children can and cannot talk to through the app. Instagram Kids would likely offer a similar set of features if and when it launches.
BuzzFeed News was the first to break the news that Instagram Kids was in development. In March, the outlet uncovered an internal company post from Instagram's VP of product, Vishal Shah, that charged developers at the company with building out the proposed service.
Controversy soon followed. Lawmakers grilled Mr. Zuckerberg back in March. In his defense, Mr. Zuckerberg claimed he lets his own children use the company's Messenger Kids service, the implication being that he'd have no problem allowing them to use Instagram Kids. Later, in April, a coalition of child safety groups sent a letter to Mr. Zuckerberg highlighting the potential harm the app could inflict on children "who are in the midst of crucial stages of developing their sense of self," according to Vox.
Facebook's argument for the service boils down to a single sentence: kids under 13 are already using Instagram, but at least with Instagram Kids, parents would have some say in how their children use the app.
To its credit, Facebook has also taken proactive steps to patch exploits in its existing services that might have exposed children to danger, such as limiting messages between teens and adults and making it harder for adults to find and follow teens on Instagram.
But for 44 of the nation's attorneys general, these measures simply aren't enough. At the heart of the coalition's letter is the notion that social media is bad for kids' mental health, self-esteem, and privacy.
"Use of social media can be detrimental to the health and well-being of children who are not equipped to navigate the challenges of having a social media account," goes the letter's opening argument.
On self-esteem, the letter cites a 2017 survey by anti-bullying nonprofit Ditch the Label. The survey found that 42% of Instagram users experienced some form of cyberbullying on the platform, the highest rate of any platform surveyed.
Of course, the AGs' letter also mentions the risk of child abuse and sexual exploitation that could stem from kids' use of social media. In 2020 alone, Facebook had to scrub 20 million images depicting the sexual abuse of minors from both Facebook and Instagram, the letter points out.
Even with almost all of the nation's top law enforcement officials squarely against the idea of Instagram Kids, Facebook seems intent on pursuing it. The company has given no clear timeline as to when the app could launch, but if and when it does, don't expect policymakers to go easy on Facebook.