The Supreme Court may have a shot to strike down Section 230 after a child’s death was blamed on a ‘blackout challenge’ seen on TikTok
The tragic accidental death of a 10-year-old could result in new limits on liability protections for social media sites like TikTok.
A 10-year-old child’s death could change the internet as we know it. Her parents allege in a lawsuit that she died after participating in a “blackout challenge” served to her on her TikTok “For You Page.”
In a decision on August 27, the US Third Circuit Court of Appeals found that, in 2021, TikTok — via its “For You Page” algorithm — recommended a video promoting a “blackout challenge” to 10-year-old Nylah Anderson.
“Nylah, still in the first year of her adolescence, likely had no idea what she was doing or that following along with the images on her screen would kill her,” Third Circuit Judge Paul Matey wrote in his concurring opinion. “But TikTok knew that Nylah would watch because the company’s customized algorithm placed the videos on her ‘For You Page.'”
The challenge encouraged viewers to choke themselves until they passed out. In a 2022 suit filed against TikTok, attorneys for her parents said Nylah died when she attempted to replicate what she saw.
The case was initially dismissed by a district judge due to Section 230. The Third Circuit, however, found that TikTok’s algorithm, which tailors content recommendations to specific users based on the posts they interact with, is a form of speech — one that isn’t protected by Section 230.
The company had argued in court that it was immune from liability under Section 230 of the Communications Decency Act. TikTok did not immediately respond to a request for comment from B-17.
Section 230 of the Communications Decency Act of 1996, often called the law that created the internet, shields online platforms like TikTok, Meta, X, and others from being held liable for content posted by users of their sites.
That means, for example, if a user posted a video encouraging viewers to hurt themselves, the platform they posted the clip to couldn’t be held liable if a susceptible viewer followed that suggestion.
But the Third Circuit ruling could change that.
Proponents of the ruling say it’s about time
“Imagine that a person walked up to Nylah at school and suggested that she asphyxiate herself. We’d immediately recognize the person’s culpability,” David French, a columnist and former attorney, wrote in a recent opinion piece for The New York Times. “Arguably, the algorithmic suggestion is even more powerful than an in-person suggestion.”
French and other supporters of the Third Circuit ruling argue that TikTok’s liability protections should end where its algorithmic suggestions begin.
Neutrally hosting a wide array of content on an online platform is fine, French and other proponents say, but promoting specific content — especially content the site administrators know could be harmful — is where a new line needs to be drawn, and the platforms themselves need to be held legally accountable, as the Third Circuit ruled.
Section 230 defenders argue the ruling is a blow to free speech
Opponents of the ruling argue that critics of Section 230 are using Anderson’s tragic death and the reasonable desire to protect children online as a way to erode speech rights.
“It’s a myth — these laws that claim to protect children,” Betsy Rosenblatt, the associate director of Case Western University’s Spangenberg Center for Law, Technology & the Arts, told B-17. “They all wear the costume of child protection, but they are not child protection underneath that — they are attempts to silence speech.”
Rosenblatt said the Third Circuit’s decision doesn’t make logical or legal sense. It may be morally reprehensible for a video to encourage a child to choke herself for online engagement, she argued, but it isn’t illegal, and it shouldn’t be illegal for the platform hosting the video to show it on a user’s “For You Page.”
“The more you require platforms to filter speech, the more platforms will have to delete first and ask questions later,” Rosenblatt told B-17. “And that means that things that are controversial, the moment they face any challenge, will get taken down, even if they should stay up.”
What’s next?
TikTok can appeal the Third Circuit’s decision. If it does, the case would land on the Supreme Court’s desk, where the justices could either take it up or let the ruling stand. Letting it stand would force platforms like TikTok to reconsider how their algorithms function to avoid liability in cases like Anderson’s.
Though the high court has so far shied away from defining the scope of Section 230, its conservative justices have previously signaled they’re open to reconsidering the statute. If they do, their ruling could have even broader consequences than the Third Circuit ruling.
Justices Clarence Thomas and Neil Gorsuch dissented in July from the court’s refusal to hear a case that would have revisited Section 230. The case stemmed from allegations that the Snapchat app has a design flaw that aids sexual predators, but lower courts found that the app’s parent company was shielded by Section 230.
And in a decision last term, SCOTUS left open the possibility of holding platforms liable based on where they are headquartered. In Moody v. NetChoice, which held that a platform’s algorithmic curation of content is a form of “expressive activity” protected as speech, Justice Amy Coney Barrett wrote that a foreign-owned social-media platform like TikTok may not have the same First Amendment protections that a US corporation does.
Rosenblatt said that if SCOTUS agrees to hear the case, it could side with the Third Circuit in holding that TikTok’s algorithmic recommendations are the site’s own speech. The question would then become whether the recommendation itself was negligent, which could carry legal consequences.
“That would still be terrible for business on the internet, but it wouldn’t kill all websites,” Rosenblatt said of upholding a narrower reading of the Third Circuit ruling. But a broader reading, under which any form of content moderation converts user speech into the platform’s own speech, would “have devastating effects on the internet ecosystem and tech more generally.”