LONDON — US regulators have said Facebook misled parents and failed to protect the privacy of children using its Messenger Kids app, including by misrepresenting the access to private user data that it provided to app developers.
As a result, the Federal Trade Commission on Wednesday proposed sweeping changes to a 2020 privacy order with Facebook — now called Meta — which would prohibit it from profiting from data it collects on users under 18. This would include data collected through its virtual reality products.
The FTC said the company has failed to fully comply with the 2020 order.
Meta would also be subject to other limitations, including on its use of face-recognition technology, and be required to provide additional privacy protections for users.
“Facebook has repeatedly violated its privacy promises,” said Samuel Levine, director of the FTC’s Bureau of Consumer Protection.
“The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”
Meta called the announcement a “political stunt”, adding: “Despite three years of continual engagement with the FTC around our agreement, they provided no opportunity to discuss this new, totally unprecedented theory.
“Let’s be clear about what the FTC is trying to do: usurp the authority of Congress to set industry-wide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate without constraint on American soil.
“We have spent vast resources building and implementing an industry-leading privacy programme under the terms of our FTC agreement. We will vigorously fight this action and expect to prevail.”
Facebook launched Messenger Kids in 2017, pitching it as a way for children to chat with family members and friends approved by their parents.