New Mexico claims Meta ‘puts profits over safety’ as trial begins
Attorney General Raúl Torrez filed suit in 2023 against the company, alleging it failed to protect children from sexual abuse
SANTA FE — New Mexico opened its trial against Meta on Monday, alleging that the parent company of Facebook and Instagram targets teens and preteens as a means of maximizing ad revenue while exposing children to sexual exploitation and other harms.
Donald Migliori, an attorney for the state, said in opening statements Monday that the state will show jurors internal Meta documents that contrast sharply with the company’s public statements about youth safety.
“The theme throughout this trial is going to be that Meta puts profits over safety,” Migliori told jurors. “The evidence in this case will be that Meta has knowingly made false and misleading statements, particularly about the safety of its platforms for teens and preteens. Safety was underprioritized and ineffective.”
Meta responded by listing tools and policies the company has created to protect young users from sexual predators and eliminate harmful and offensive content from its platforms.
“Meta designs its apps to be fun and entertaining, not to harm teens in New Mexico,” said Kevin Huff, a Meta attorney. “Meta succeeds in removing the vast, vast majority of bad content like child exploitation and pro-suicide material. No one wants to see an app where they see offensive content and have bad experiences.”
New Mexico Attorney General Raúl Torrez filed the suit in 2023 against Meta Platforms and its CEO, Mark Zuckerberg, alleging the company failed to protect children from sexual abuse, online solicitation and human trafficking.
Attorneys last week seated an 18-person jury that includes six alternates. The trial before Judge Bryan Biedscheid in 1st Judicial District Court in Santa Fe is expected to last six weeks.
Migliori alleged that Meta places particular value on teens and preteens because they remain engaged for longer periods, bringing the company advertising revenue. “They want tweens on their platforms,” he said.
Migliori gave jurors an “abbreviated presentation” of internal Meta documents obtained by the state, which he said contradict public statements by Zuckerberg and other company officials asserting that Meta places a high priority on youth safety.
“Meta made its profits while publicly misrepresenting that its platforms were safe for you, downplaying or outright lying about what it knows about the dangers of its platforms,” he said.
Migliori alleged that Zuckerberg placed free expression ahead of safety, sharing with jurors an internal document that he said shows Zuckerberg making that priority explicit.
“It’s not safety first,” he told jurors. “It’s growth and freedom of expression first.”
Migliori also told jurors they will hear testimony from former Meta employees who will say that Meta has few systems in place to prevent teens and preteens from receiving messages from adults.
Meta’s attorney said some 3 billion people worldwide use Meta platforms and acknowledged that harmful content and predators find ways to avoid automated systems.
“Meta has built amazing automated tools to protect people,” Huff told jurors. But some harmful content escapes detection, given that more than 1 billion pictures, comments, texts and videos are posted each day on Facebook and Instagram, he said.
“Meta discloses that its systems aren’t perfect and bad content gets through its safeguards,” Huff said in opening statements. “Meta is constantly working to improve those systems.”
Huff argued that Meta has 40,000 employees — “enough to fill up University Stadium” — working to make its platforms safe, including a team of 75 former prosecutors and law enforcement officers who have disrupted 37 child exploitation networks nationwide and removed some 200,000 accounts associated with the networks.
“The bottom line is that when Meta becomes aware of child sexual abuse material on its platforms, it reports to the (National Center for Missing and Exploited Children) as quickly as possible,” he said.
As a matter of policy, Meta prohibits children under 13 from opening Facebook and Instagram accounts, Huff said, but he acknowledged that many children under 13 lie about their age to open accounts. Meta has developed tools to detect accounts belonging to children under 13 and shut them down.
“Meta isn’t hiding that there are large numbers of kids on its apps,” he said. “That’s disclosure, not deception.”
Huff also described Facebook’s “teen account” system, which limits who can contact users on Facebook Messenger, along with other controls and filters parents can use to protect children ages 13 to 17.
In preparing the suit, New Mexico investigators created fictitious Facebook and Instagram accounts posing as boys and girls in their early teens; the suit alleges the accounts quickly attracted thousands of friends and followers, most of them adult men.
Huff alleged that in creating an account for “Issa Bee,” a fictitious 13-year-old girl, investigators for the New Mexico Department of Justice used a photo of a real person without informing her.
“The state’s investigators then used this account with this real woman’s picture to communicate with a person the state had been investigating as a child sex predator,” Huff alleged. “A simple Google search by that predator would have allowed him to find this woman. That’s how we discovered she is a real person. The state put her at risk for this investigation.”
Huff also said that investigators intentionally circumvented Facebook’s safeguards to create the fictitious account, for example by listing Issa Bee as an adult.
A spokeswoman for the state Department of Justice did not immediately respond Monday to a request for comment about the allegation.
The New Mexico DOJ brought the suit under the state Unfair Practices Act, which prohibits deceptive trade practices. The law requires the state to prove that Meta knowingly made false or misleading representations about the safety of its products.
Meta argued in court records that it can’t be held liable for content shared by third-party users and that its practices are protected under Section 230 of the federal Communications Decency Act, “as courts nationwide have held.”
Congress passed Section 230 in 1996 to provide immunity from liability to online computer services for third-party content created by users. The law has been interpreted by courts to shield social media companies from liability as a “publisher” of material posted by users.
States and individuals have filed thousands of lawsuits in state and federal courts across the U.S. alleging that social media platforms are designed to lure young users into compulsive use. A trial began earlier this month in Los Angeles Superior Court alleging that Meta and YouTube designed addictive platforms resulting in mental health problems for young users.
New Mexico’s legal challenge differs from many other lawsuits by focusing on sexual exploitation, alleging that Meta “knowingly exposes children to the twin dangers of sexual exploitation and mental health harm.”
Olivier Uyttebrouck covers courts for the Journal. You can reach him at olivier@abqjournal.com.