Meta's reputation for poking around where it shouldn't is well established. The tech conglomerate holds data on hundreds of millions of users, and as a society we have largely accepted this. What many forget is that the issue affects not only adults but also children. Now, newly filed court documents allege that since 2019, Meta has refused to close the accounts of children under 13 while continuing to collect their personal information.
The attorneys general of 33 U.S. states have accused Meta of receiving over a million reports of under-13 users on Instagram. These reports included data about the children themselves and about their close circles (parents, friends, and so on). Despite these accounts blatantly violating the company's own policies, Meta only "deactivated a fraction of those accounts," according to the lawsuit.
Meta's conduct also ran afoul of the Children's Online Privacy Protection Act (COPPA), which prohibits companies from collecting personal information from children under 13 without explicit parental consent. According to the court filings, Meta's own records show that a significant portion of Instagram's audience consists of children under 13.
The lawsuit also accuses Meta of knowing that its algorithm could steer minors toward highly harmful content. It's no secret that the social media platforms we use daily can worsen mental health and fuel negative comparisons among users.
However, the Zuckerberg-led conglomerate doesn't seem eager to reverse course, at least for the moment. Other internal documents show that a Meta employee described such controversial material as part of the "most engaging content (on the Explore page)."