Facebook's Messenger Kids app had one job: keep children away from strangers. It failed.
When Facebook introduced Messenger Kids in late 2017, it marketed the app as a way to let kids communicate with friends and trusted contacts without being exposed to strangers. As it turns out, the app hasn't been doing a great job of that. This week, Facebook began alerting parents that an apparent "technical error" allowed children to be added to group chats with unapproved strangers.
Messenger Kids is supposed to work like this: parents create an account for a child under the age of 13 and approve the specific contacts who can communicate with the young user. A Messenger Kids account doesn't create a Facebook account for the child; instead, it is linked to the parent's account so the parent can control the entire experience. The account also doesn't need to be registered with a phone number, so users who aren't approved to contact a kid through the platform shouldn't be able to find or interact with them at all.
The problem is that a technical flaw in the app created a loophole that let kids end up in group chats with unapproved users. One-on-one chats through Messenger Kids can only take place between approved contacts, but it was discovered that either party in a chat could add additional users to the conversation, including people who were not approved to talk to the other kid in the chat. So if your child and their friend were chatting, that friend could add another person to the conversation whom you never gave permission to contact your kid.
According to The Verge, Facebook has been quietly shutting down these unapproved group chats, of which there were reportedly thousands. The company has also started alerting parents to the issue, sending out messages indicating that their child was included in an unapproved group chat that has since been shut down. The company has not said how long the bug was present in the app: whether it appeared recently or has existed since the app's launch.
The latest issue for Messenger Kids could not have come at a worse time for the company, though at this point there's no good time for Facebook to suffer another privacy failure. The social networking giant is preparing to pay a $5 billion FTC fine over a rash of apparent privacy violations, including the Cambridge Analytica scandal, in which a third-party app sucked up personal information from millions of users without their consent. With the magnifying glass still on the company, Facebook could face additional scrutiny over the Messenger Kids debacle. The FTC recently cracked down on TikTok (formerly Musical.ly) for collecting data from underage users without parental consent, so the agency already has its attention turned to how companies handle young users. Fines under the Children's Online Privacy Protection Act (COPPA) are no joke and can add up fast with widespread violations, so Facebook may want to start setting aside some more money just in case.