Lawsuit Claims TikTok Algorithm Promotes Violent Videos to Minorities

TikTok has been hit with another lawsuit, this time over the death of a 14-year-old African-American girl.

The lawsuit claims that the platform’s algorithm promotes violent videos to minorities.

Meta Platforms Inc., Snap Inc., and TikTok parent company ByteDance Ltd. are also named as defendants in the lawsuit, which was filed on Wednesday in a San Francisco federal court.

The girl, Englyn Roberts, died in September 2020, two weeks after a suicide attempt. Lawyers representing her parents claim that TikTok “is aware of biases in its algorithm relating to race and socio-economic status,” according to a report from Mercury News.

Her parents, Brandy and Toney Roberts, bought her a phone when she was ten years old and kept her passwords so they could check her social media, but it was not enough. The girl was subjected to online bullying that they never knew about.

“We would regulate, you know, her phone. She would turn it in. We had the passwords to all her different social media accounts,” the parents told WAFB.

They added, “She became 13, we went on a cruise, and she was just adamant about having to have the phone, like she couldn’t put it down. And on a cruise most of the time you do not have Internet access, so she was really upset that she wouldn’t be able to get on her phone.”

“As parents, you think you know, and you think you are smart because I’m the dad. When she would leave to go to school, I would check her phone. We had the passcode, but what I just didn’t know is where the information was, especially on Instagram,” they continued.

The parents no longer believe it was simple bullying; they now contend that TikTok was deliberately sending harmful content to minorities.

“TikTok’s social media product did direct and promote harmful and violent content in greater numbers to Englyn Roberts than what they promoted and amplified to other, Caucasian users of similar age, gender, and state of residence,” her parents claim.

The young teenager’s parents are being represented by Matthew Bergman, founder of the Social Media Victims Law Center.

“The nature of these algorithms and these products is that they are designed to be addictive to teenagers, and all three of these teenagers became addicted to their products and consequently suffered sleep deprivation and anxiety and couldn’t get away from it,” Bergman told WAFB.

Bergman added that “their profit model is based upon maximizing screen time and engagement over and above everything else. The way they do that is by designing algorithms that direct children, children whose brains aren’t fully developed, to content children have no business looking at.”

Two other families filed wrongful death lawsuits against the social media giant earlier this month.

Lalani Erika Walton, 8, and Arriani Jaileen Arroyo, 9, both died while attempting the “blackout challenge,” which had gone viral on TikTok.

The lawsuit also alleges that the girls were victims of the app’s algorithm.

“According to TikTok, its proprietary algorithm is ‘a recommendation system that delivers content to each user that is likely to be of interest to that particular user…each person’s feed is unique and tailored to that specific individual,’” the documents, obtained by Radar Online, state. “In other words, TikTok has specifically curated and determined that these Blackout Challenge videos – videos featuring users who purposefully strangulate themselves until losing consciousness – are appropriate and fitting for small children.”

On February 26, 2021, Arroyo was found “hanging from the family dog’s leash.” She was alive but had to be placed on a ventilator and life support at the hospital, and she was left with no brain function. Eventually, the family decided to take her off life support, and she passed away.

Five months later, on July 15, 2021, Walton was watching videos during a car ride and discovered the “blackout challenge.” When they got home, the family went swimming, and her stepmother went to take a nap.

When she woke up, she found the eight-year-old “hanging from her bed with a rope around her neck.”

The court filing says that the girl was “under the belief that if she posted a video of herself doing the Blackout Challenge, then she would become famous.”

The court filing contends that TikTok designs its algorithms to “addict users and cause them to spend as much time on the application as possible through advanced analytics that create a variable reward system tailored to user’s viewing habits and interests.”

The families also argued that the social media giant does not do enough to keep children off its platform.

“TikTok purports to have a minimum age requirement of 13-years-old but does little to verify user’s age or enforce its age limitations despite having actual knowledge that use by underage users is widespread. TikTok knows that hundreds of thousands of children as young as six years old are currently using its social media product but undertakes no attempt to identify such users and terminate their usage,” the families claim.

The lawsuit additionally argues that TikTok does nothing to “prevent young users from being affirmatively directed to highly dangerous content such as the Blackout Challenges.”

“TikTok has invested billions of dollars to intentionally design and develop its product to encourage, enable, and push content to teens and children that Defendant knows to be problematic and highly detrimental to its minor users’ mental health,” the lawsuit says, according to Radar’s report.
