A federal judge has dismissed a Pennsylvania mother’s lawsuit against TikTok, in which she claimed the social media giant was responsible for her daughter’s death after the girl attempted the so-called “Blackout Challenge.”


The Blackout Challenge is a series of videos distributed through TikTok that encourage viewers to choke themselves until they pass out. TikTok is currently facing multiple lawsuits from parents whose children died of strangulation after viewing these videos on the service. The lawsuits allege that at least seven children, all under 15 years old, died last year attempting the “challenge.”

The judge’s dismissal of the case centered on the controversial Section 230 of the Communications Decency Act, which gives online platforms immunity from legal claims over content published on their services by third parties, such as user-generated videos. The mother claimed there was a design defect in the algorithm TikTok uses to present content to its users, and that TikTok had failed to warn viewers that the Blackout Challenge was dangerous and life-threatening. The judge held, however, that she could not get around Section 230 immunity simply by repackaging her claims as product liability claims, and remarked that it was Congress’s job, not his, to recognize any exceptions to Section 230 immunity.


Following the filing of this lawsuit in 2021, TikTok blocked users from searching for “Blackout Challenge,” instead directing them to a screen warning that “[s]ome online challenges can be dangerous, disturbing or even fabricated.” However, Blackout Challenge videos remain on the service and can still appear in a user’s feed through TikTok’s recommendation algorithm.

The result in this lawsuit does not bode well for the similar suits still pending, which will very likely meet the same fate. Until Congress revisits Section 230, social media platforms are unlikely to be required to restrict content such as the Blackout Challenge, even when it can be directly harmful to children.