The FTC doesn’t want Meta profiting off kids
The FTC wants to stop Meta from launching new products and services, and from profiting off the data it collects from minors, until the company complies with privacy requirements for young users.
The FTC reached a $5 billion settlement with Facebook in 2019 over allegations that the company violated a 2012 FTC privacy order by misleading users about their ability to control the privacy of their data.
“Facebook has repeatedly violated its privacy promises,” said Samuel Levine, Director of the FTC’s Bureau of Consumer Protection. “The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”
The proposed changes would prevent Meta from profiting from the data it collects from minors, including data gathered through its virtual reality products. Meta would also face tighter restrictions on its use of facial recognition technology and be required to provide additional privacy protections for its users.
The changes include a blanket prohibition on monetizing the data of children and teens under 18, a pause on launching new products and services, an extension of compliance obligations to companies Meta merges with or acquires, limits on future uses of facial recognition technology, and a strengthening of existing requirements.
Facebook believes it’s being unfairly targeted.
“Let’s be clear about what the FTC is trying to do: usurp the authority of Congress to set industry-wide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate without constraint on American soil,” a Facebook spokesperson said.
“FTC Chair Lina Khan’s insistence on using any measure — however baseless — to antagonize American business has reached a new low,” they continued.
Invoking TikTok is probably not Meta’s strongest rebuttal to allegations that it violated its agreements: the US government has voiced its own concerns about the platform, and proposed legislation could restrict or ban TikTok outright.
Facebook’s orders & violations
The 2020 privacy order required Facebook to pay a $5 billion civil penalty, expand its privacy program, and broaden the role of an independent third-party assessor tasked with evaluating the program’s effectiveness. For example, under the 2020 order, Facebook must conduct a privacy review of any new or modified product, service, or practice before implementation and document its decisions about how to mitigate risks.
According to the Order to Show Cause, the independent assessor identified several weaknesses in Facebook’s privacy program that pose significant risks to the public.
The FTC also alleges that from late 2017 to mid-2019, Facebook misrepresented to parents the extent of control they had over their children’s communications in its Messenger Kids product. Facebook promised that the app would let kids communicate only with contacts approved by their parents.
However, in certain circumstances, kids could communicate with unapproved contacts in group text chats and group video calls. The FTC says these misrepresentations violated the 2012 order, the FTC Act, and the COPPA Rule.
The COPPA Rule requires operators of websites or online services directed at children under 13 to notify parents and obtain verified parental consent before collecting personal data from those children.
The FTC has formally requested that Meta respond within 30 days to the proposed findings from the agency’s investigation as it seeks to modify the 2020 order. The Commission voted 3-0 to issue the Order to Show Cause, which begins a process in which Meta will have an opportunity to respond.
After reviewing the facts and any arguments from Meta, the FTC will determine whether modifying the 2020 order is in the public interest or justified by changed conditions of fact or law.