New York, NY – November 28, 2023
Explosive revelations have emerged from recently unsealed court documents in an ongoing federal lawsuit against Meta, shedding light on the company’s alleged willful disregard for child safety and violations of the Children’s Online Privacy Protection Act (COPPA). The documents indicate that Meta, formerly known as Facebook, knowingly allowed harmful content to reach children and collected their personal information without parental consent.
According to the unsealed court papers, Meta declined to disable the majority of reported accounts belonging to children under the age of 13 and continued collecting their personal data, conduct the states allege directly violates COPPA, which prohibits collecting personal information from children under 13 without parental consent. Attorneys general from 33 states jointly allege that Meta received more than one million reports of under-13 users on its Instagram platform.
The federal lawsuit, which consolidates allegations from multiple states, accuses Meta of violating state consumer protection statutes in addition to the federal COPPA law. The complaint seeks court orders to halt Meta's practices and warns of civil penalties that could reach hundreds of millions of dollars.
The court documents also point to Meta's alleged awareness of the harm its recommendation algorithms could cause. The filings allege that Meta knew its algorithms could steer children toward harmful content, to the detriment of their well-being. These claims raise serious questions about Meta's commitment to user safety, particularly for children, its most vulnerable users.
This latest development adds to the mounting legal challenges and public scrutiny facing Meta, which is under growing pressure to address the impact of its platforms on users, especially minors. The potential consequences extend beyond financial penalties: the lawsuit could compel Meta to make significant changes to its policies and practices to comply with existing regulations and safeguard the well-being of young users.