Attorney General Knudsen files lawsuit against Instagram parent company Meta
HELENA – Montana Attorney General Austin Knudsen announced today the State is suing Meta, the parent company of Facebook and Instagram, in federal court, alleging that the company’s misrepresentations of content available on Instagram violate Montana’s consumer protection laws and that its data collection practices violate federal law. The lawsuit also alleges that Instagram was intentionally designed to be addictive, particularly to minors, and that this addiction harms Montana minors by substantially affecting their mental health. The lawsuit follows an investigation that Attorney General Knudsen launched in 2021.

The lawsuit, filed in the U.S. District Court for the District of Montana, seeks a preliminary and a permanent injunction to compel Instagram to cease its deceptive and unfair statements about the frequency and severity of drug and alcohol content, sexual content, nudity, mature/suggestive themes, and profanity on the Instagram platform; its inaccurate age ratings in the App Store and other online marketplaces; and its deceptive public assurances in the Instagram Community Guidelines and elsewhere. Because these misrepresentations violate Montana’s Consumer Protection Act, the State may receive civil penalties of up to $10,000 for each violation. The lawsuit also seeks a permanent injunction to compel compliance with COPPA’s parental consent provisions.

“Meta must be held accountable for its deceptive practices and the harm it has caused. Instagram’s intentionally addictive design and its failure to address the rampant presence of harmful content on its platform, including explicit drug promotion and sexual exploitation, pose serious risks to the mental health and well-being of young Montanans,” Attorney General Knudsen said. “Internal documents leaked to the public show that Meta was aware of the harms it inflicted on minors, but instead of taking corrective action, the company continued to tell parents it was a safe product for their kids to use. I am committed to safeguarding the rights and protection of Montanans: social media companies must comply with our state laws and prioritize the safety and privacy of our citizens.”

Instagram was intentionally designed to be addictive, especially for minors, who are referred to in internal company documents as a “valuable, but untapped” user base. Despite Instagram’s addictive nature and design, Meta has misleadingly conveyed that it is safe for users under 17, evidenced by a “12+” rating in the App Store and a “T” for “Teen” rating in other stores. The company’s failure to disclose the addictive nature of Instagram, particularly to minors, constitutes deceptive acts that adversely impact the mental health and overall well-being of Montana minors. The intentional design for addictiveness, as alleged, underscores Meta’s knowing engagement in these harmful actions.

Meta tells consumers that Instagram contains only “infrequent” or “mild” content related to things like drug use, sexual content or nudity, and other mature themes, and claims through ratings in online marketplaces like the App Store that it is suitable for users aged 12 and older. These representations, however, are patently false, as the company allows “sexual content and nudity, alcohol, tobacco, and drug use and references, and mature/suggestive themes on the Instagram platform, including readily accessible hardcore pornography.” The platform has also been found to promote or allow child pornography, sexual extortion of teenagers, and open dealing of opioids and other drugs.

Despite advertising itself as only having “infrequent/mild” references to drugs, alcohol, and tobacco:

  • A Digital Citizens Alliance investigative report revealed drugs being advertised on Instagram and evidenced the ease with which a user can purchase drugs via its direct messaging platform. As researchers followed more drug dealers, Instagram’s algorithms amplified the problem, pushing more drug sellers toward the researchers’ account;
  • Other investigative reporters found that minor users were able to easily search for age-restricted and illegal drugs, and Instagram’s algorithm pushed accounts of drug dealers selling opioids and other drugs;
  • Instagram violates its Community Guidelines by failing to remove offending content when it is made aware of violating drug content; and
  • The State’s investigation also revealed that extreme drug content is rampant, and that drug content is easy to find and to access.

Despite advertising itself as only having “infrequent/mild” sexual content and nudity:

  • The State’s own research using a test account revealed that nudity, sexual content, and pornography are widespread and easily accessed on Instagram;
  • Instagram imposes no meaningful barriers to prevent children from being exposed to or searching for pornography; and
  • Underage sex content is prevalent and easily accessible, and even actively promoted by the platform’s algorithm. Researchers have concluded that “The problem [with child sex abuse material] on Instagram is particularly severe” compared to other social platforms.

Despite telling consumers in the App Store that “mature/suggestive themes” are “infrequent/mild”:

  • Meta is aware that its product leads young people, especially girls and young women, to mental health crises including self-harm and—in many cases—suicide. Instagram even promotes content that explicitly references and, in some cases, encourages self-harm and suicide;
  • Defendants are also aware that teenagers are being targeted with sexual extortion scams on Instagram. At least a dozen boys died by suicide in 2022 after being blackmailed in this manner, including one in Montana; and
  • Public reporting reveals that Instagram knowingly depicts mature content related to body dysmorphia and eating disorders. Once a younger user has expressed interest in content related to body image or eating disorders, the app recommends more content, creating a spiral for young users.

Additionally, the Children’s Online Privacy Protection Act (COPPA) requires operators of certain online services to obtain parental consent before collecting personal information online from children under the age of 13. Meta has repeatedly violated COPPA by collecting information from children on Instagram without first obtaining – or even attempting to obtain – the informed parental consent that COPPA requires. Further, the company has repeatedly and systematically failed to provide notice to parents about the information Instagram collects from children and how it is used and disclosed.

###