FTC Accuses Meta of Failing to Protect Children on Messenger Kids

On Monday, the Federal Trade Commission (FTC) filed a complaint against Meta Platforms, accusing the social media giant of failing to protect children on its Messenger Kids app. The FTC alleges that Meta violated a 2019 consent order by allowing children to join group chats with adults who were not approved by the child’s parent or guardian.

Messenger Kids is a messaging app designed for children under 13 that requires parental consent to use. The app was launched in 2017, and at the time, Meta said it had been created with input from parents and child development experts to provide a safe and fun way for kids to communicate with friends and family.

However, according to the FTC, Messenger Kids had several flaws that allowed unauthorized users to join group chats with children. The FTC claims that between 2019 and 2021, the app lacked adequate safeguards against such users and did not notify parents when their child joined a group chat that included one.

The complaint also alleges that Meta failed to provide adequate training to employees responsible for reviewing reports of unauthorized users on Messenger Kids, which led to delays in removing these users from the app.

In response to the complaint, Meta said that it disagrees with the FTC’s allegations and that it has taken steps to improve the safety and privacy of Messenger Kids. “We have already made changes to address the issues raised by the FTC and will continue to focus on making Messenger Kids the safest and most secure messaging app for kids,” a Meta spokesperson said in a statement.

The FTC is seeking a permanent injunction that would require Meta to delete all personal information collected from children who used Messenger Kids, as well as monetary penalties.

The complaint against Meta comes as concerns about online safety for children continue to grow, with many parents and experts worried about the impact of social media on young people’s mental health and well-being. The case also highlights the challenges that tech companies face in balancing the need for innovation and growth with the responsibility to protect vulnerable users, particularly children.
