Parents across the United States are taking a stand against Meta Platforms, Inc., the social media giant behind Facebook and Instagram, with one family in Georgetown joining the fight.
Ben Martin, a Scott County School Resource Officer, and his wife, Jennifer, have joined a number of concerned parents seeking to hold social media platforms accountable for allegedly knowing about the dangerous content they promote to adolescents. The lawsuits claim Meta's content and algorithms continue to push harmful material onto youths, often leading to serious health conditions including eating disorders, depression, addiction and even suicide.
The Martins state in their filing that Meta should be held accountable for “causing and contributing to [the] burgeoning mental health crisis perpetrated upon the children and teenagers of the United States.” They also allege that Instagram created “the perfect storm” of addicting, harmful content that led their daughter, 19-year-old Alexandra Martin, “down a rabbit hole of crisis.”
When Alex opened her first social media account, her parents said she was interested in recipes, an interest they allege Meta eventually twisted into showcasing unhealthy body image standards and promoting eating disorders and extreme workouts designed to target tweens and teens.
Ben Martin said that after hearing the story of Alexis Spence, another teenager involved in a social media lawsuit, he and his wife began researching ways they could help put an end to dangerous, maybe even life-threatening, content driven to bring in engagement and money for Meta. In doing so, they came across the Social Media Victims Law Center (SMVLC), based in California.
According to the firm’s website, the group has moved to hold social media companies like Meta legally accountable for the damage done to vulnerable users.
“SMVLC seeks to apply principles of product liability to force social media companies to elevate consumer safety to the forefront of their economic analysis and design safer platforms that protect users from foreseeable harm. Only civil litigation can force social medias to compensate victims for the harm caused by their products.
“Successful court recoveries on behalf of current social media victims will not only furnish the compensation they need and deserve but also incentivize social media companies to design safer products to avoid having to pay court awards in the future. While government regulation plays an important role in curbing known abuses, the civil justice system can force social media companies to act proactively to include consumer safety in the cost of production,” the firm’s website states.
SMVLC Founding Attorney Matthew P. Bergman said that in the Martins' case, nothing about the trauma Alex suffered as a result of Meta's aggressive targeting algorithms was a coincidence.
“What has happened to her is what is happening to many thousands of other children around the country who become addicted to social media, who are encouraged to open up illicit accounts without their parents' consent, and who are intentionally directed to harmful and hurtful content through algorithms designed to maximize user engagement, even though they know that it does so by directing children, in this case, to content that promotes anorexia,” he said.
It varies from product to product, but most social media platforms don't employ any form of verification, such as confirming a user's age or number of accounts, to ensure the safety of their minor users, Bergman said.
“Secondly, the documents that Frances [Haugen, Facebook whistleblower] has revealed demonstrate that Instagram refers to multiple accounts opened by kids as a ‘value add,’ and that they know that children can evade their parental control and oversight by opening up multiple accounts,” he said.
“Other platforms, similarly, are designed to evade parental oversight responsibility, most significantly with Snapchat's disappearing messages feature, where kids explicitly are encouraged to use the product because their parents aren't going to find out what they're doing. TikTok also allows kids to open multiple accounts on the same URL.”
Companies like Meta profit from user engagement, measured by how many times a person visits their product and how long they stay, Bergman said. The more time and clicks the company can get from a child, the greater the business' profit. Because of this, he said, the algorithms used on these popular sites are designed to addict children through what's called an “operant conditioning model,” in which selective images are provided.
“What they know from the standpoint of basic neurology is that the dopamine impact that causes kids to stay online and to keep clicking is greater if children are subjected to psychologically discordant material. So in order to keep people on, they have to progressively send them to more and more extreme content.
“A child such as Alex was just, you know, interested in recipes, and in a short matter of time through an algorithmic process, she was directed toward sites that promote body shaming, promote anorexic behavior, and promote a body image that is unnatural and unhealthy,” he said.
Meta and other companies know from their own documents the harm being caused to adolescents, including that one in three females on Instagram feel bad about their bodies, Bergman said. He added that although it has been proven the algorithms and lack of security and parental control add to the promotion and subjection of such harmful content, Meta has done nothing to correct or undo the damage it’s causing.
Although social media giants are seemingly against any change, Bergman said he has never encountered a group of clients more committed to safety and less concerned about money than the parents involved in setting the record straight. He added in his 25 years as a plaintiff lawyer, he’s yet to see anyone as dedicated to preventing other families from going through what they did as they work to hold the companies accountable.
“There’s carnage going on out there with our young people,” he said. “You know, we have a 146 percent increase in suicide, increasing rates of eating disorders, attempted suicide, self-harm and depression. Parents are starting to say enough is enough. We need to stand up and hold these companies accountable. The only way we do that is to affect their bottom line.
“Meta has documents that perhaps haven’t been disclosed, talking about teenagers and says tweens are herd animals. A company that refers to children as animals is unlikely to be a company that’s going to respond to moral persuasion. Rather, they’re going to respond to being held accountable legally, to pay for the impact of the damage that they’ve done. Then and only then will they be incentivized to design safer social media products that provide kids with the kind of material they want and need, as opposed to directing them to this very harmful and destructive material.”
The technology is out there and readily available for companies like Meta to implement that would allow for stricter regulations and verification tasks, Bergman said. He added apps such as Tinder and other dating services utilize such technology for their users.
“If hook-up sites have these technologies readily available, why don’t sites that are specifically geared toward children? Our children are even more entitled to those,” he said.
The Martins said during Alex’s eating disorder, they worked to delete all of her social media because the accounts acted as a trigger for her.
“We actually took her phone away, and we actually monitored her phone, but just the way the algorithms are built… Had we known about that back in 2015, it would have been a lot different and that’s something that Meta needs to be accountable for and put more safety regulations in,” said Jennifer Martin.
“I was very vocal about Alex’s eating disorder. We did not keep it private. Once word got out that Alex was going through this, we had several families reach out to us. We would ask Alex, what advice would you give that child or that parent, and the first thing she would always say is to delete Instagram. Do not be on Instagram because it is such a trigger,” said Ben Martin.
SMVLC Attorney Laura Marquez-Garrett added that officials are still learning about these products as well, emphasizing how “tricky” it can be for parents to fully monitor what companies like Meta are showing their children.
“Even with the monitoring apps where you can see what messages are going back and forth, the sort of hidden danger here that nobody contemplated, nobody foresaw, was the content itself: the explore pages and the feed ranking. The only other thing I'd add to that is what we're also learning now is that more sophisticated parents, at least, can actually aggressively try and reprogram the algorithm.
“It’s based on a long process, but if you go in there, you can actually, by spending an hour every couple of days, reprogram those algorithms. This is the crazy part about the product, you know, is one of the fixes they could do for minors. It’s a programming thing, and they could quickly reset their algorithms so that minors are not getting all of the harmful content. But in the absence of that… It would be spending a couple hours a week doing what you can to at least try and negate some of the impact because that’s how powerful these recommendation technology tools that they’re using are,” she said.
Ben Martin said he is appreciative of his community's support and of the opportunity to hold Meta accountable, preventing other children from going through something as traumatic as what his own daughter experienced.
“Our goal is to reach out to other teens and to have other parents hopefully receive this information and shed some light on what the dangers of Instagram, and specifically Meta, are doing,” he said. “They elected to not do anything about it because that would disengage users’ time on that platform, so it’s very frustrating that they’re putting money over safety.”
Concerned parents in search of information regarding social media and the precautions they can take can visit www.socialmediavictims.org.