
Lawsuits against social media companies use product-liability theory to sidestep immunity law

Social media companies should be liable under product-liability law for designing their products in a way that addicts their users and causes other harms, according to a growing number of lawsuits.

Law.com has a story on the suits, which contend that Section 230 of the Communications Decency Act doesn’t protect the companies from claims over their design choices. Section 230 shields technology companies from liability for content posted by others.

Whether the suits will succeed will depend on the harms alleged and the jurisdictions where they are filed, according to Jeffrey Goodman of Saltz Mongeluzzi & Bendesky, a Philadelphia personal injury firm.

“How product liability works is different in all 50 states,” he told Law.com.

The theory got a boost in May 2021, when the 9th U.S. Circuit Court of Appeals at San Francisco ruled against Snapchat in a suit alleging that three youths died in a car crash after using the app’s speed filter. The 9th Circuit said Snapchat didn’t have Section 230 immunity because the lawsuit was based on a negligent design claim, rather than third-party content.

Goodman has filed a product-liability suit alleging that TikTok is liable for the death of a 10-year-old girl who tried a viral “blackout challenge” seen on the platform, according to prior Law.com coverage. The suit claims that TikTok isn’t protected by Section 230 because it created an algorithm designed to foster addiction among children who watch and share viral challenges.

Other lawyers who have filed suits using a product-liability theory include Carrie Goldberg of C.A. Goldberg in New York, Alabama lawyer Joseph VanZandt of Beasley Allen and Matthew Bergman of the Social Media Victims Law Center, according to Law.com.

One issue in the cases is whether social media apps are products covered by product-liability law. A motion filed by Instagram parent company Meta in federal court in San Francisco seeks to toss a case on that ground. The suit claims that Meta created algorithms promoting content dangerous to the mental health of a girl who died by suicide.

The motion quotes another case to argue against expanding product-liability law to cover apps.

“The purposes served by products-liability law … are focused on the tangible world and do not take into consideration the unique characteristics of ideas and expressions,” the motion says.
