Mass Law Blog

Anderson v. TikTok: A Potential Sea Change for § 230 Immunity


In late August the U.S. Court of Appeals for the Third Circuit released a far-reaching decision, holding that § 230 of the Communications Decency Act (CDA) did not provide a safe harbor for the social media company TikTok when its algorithm recommended and promoted a video that allegedly led a minor to accidentally kill herself. Anderson v. TikTok (3d Cir. Aug. 27, 2024).

Introduction

First, a brief reminder: § 230, enacted in 1996, has been the guardian angel of internet platform owners. The law prohibits courts from treating a provider of an “interactive computer service” (i.e., a website) as the “publisher or speaker” of third-party content posted on its platform. 47 U.S.C. § 230(c)(1). Under § 230, websites have been given broad legal protection, creating what is, in effect, a form of legal exceptionalism for internet publishers. Without it, any social media site (such as Facebook or X) or review site (such as Amazon) would be sued into oblivion.

On the whole, the courts have applied the law liberally, dismissing cases against internet providers under many fact scenarios. However, a vocal group argues that the broad immunity courts have read into § 230 rests on overzealous interpretations that go far beyond the statute’s original intent.

Right now § 230 has one particularly prominent critic – Supreme Court Justice Clarence Thomas, who has not held back in expressing his disagreement with the broad protection the courts have provided under § 230.

In Malwarebytes, Inc. v. Enigma Software (2020), the Court denied a petition for writ of certiorari, but Justice Thomas issued a “statement” respecting the denial –

Nowhere does [§ 230] protect a company that is itself the information content provider . . . And an information content provider is not just the primary author or creator; it is anyone “responsible, in whole or in part, for the creation or development” of the content.

Again in Doe ex rel. Roe v. Snap, Inc. (2024), Justice Thomas dissented from the denial of certiorari and was critical of the scope of § 230, stating – 

In the platforms’ world, they are fully responsible for their websites when it results in constitutional protections, but the moment that responsibility could lead to liability, they can disclaim any obligations and enjoy greater protections from suit than nearly any other industry. The Court should consider if this state of affairs is what § 230 demands. 

With these judicial headwinds, Anderson v. TikTok sailed into the Third Circuit. Even one Supreme Court justice is enough to create a Category Two storm in the legal world. And boy, did the Third Circuit deliver, joining the § 230 opposition and potentially rewriting the rulebook on internet platform immunity.

Anderson v. TikTok

Nylah Anderson, a 10-year-old girl, died after attempting the “Blackout Challenge” she saw on TikTok. The challenge, which encourages users to choke themselves until losing consciousness, appeared on Nylah’s “For You Page,” a feed of videos curated by TikTok’s algorithm.

Nylah’s mother sued TikTok, alleging the company was aware of the challenge and promoted the videos to minors. TikTok defended itself using § 230, arguing that its algorithm shouldn’t strip away its immunity for content posted by others.

The district court dismissed the complaint, holding that TikTok was immunized by § 230. The Third Circuit reversed.

The Third Circuit Ruling

The Third Circuit took a novel approach to interpreting § 230, concluding that when internet platforms use algorithms to curate and recommend content, they are engaging in “first-party speech,” essentially creating their own expressive content.

The court based this conclusion largely on the Supreme Court’s recent decision in Moody v. NetChoice (2024). In that case the Court held that an internet platform’s algorithm that reflects “editorial judgments” about content compilation is the platform’s own “expressive product,” protected by the First Amendment. The Third Circuit reasoned that if algorithms are first-party speech under the First Amendment, they must be first-party speech under § 230 too.

Here is the court’s reasoning:

§ 230 immunizes [web sites] only to the extent that they are sued for “information provided by another information content provider.” In other words, [web sites] are immunized only if they are sued for someone else’s expressive activity or content (i.e., third-party speech), but they are not immunized if they are sued for their own expressive activity or content (i.e., first-party speech). . . . Given the Supreme Court’s observations that platforms engage in protected first-party speech under the First Amendment when they curate compilations of others’ content via their expressive algorithms, it follows that doing so amounts to first-party speech under § 230. . . . TikTok’s algorithm, which recommended the Blackout Challenge to Nylah on her FYP, was TikTok’s own “expressive activity,” and thus its first-party speech.

Accordingly, TikTok was not protected under § 230, and Anderson’s case could proceed.

Whether the Third Circuit’s logic will be adopted by other courts (including the Supreme Court, as I discuss below) is an open question. The court’s reasoning assumes that the definition of “speech” should be consistent across First Amendment and CDA § 230 contexts. However, these are distinct legal frameworks with different purposes: the First Amendment protects freedom of expression from government interference, while § 230 shields internet platforms from liability for third-party content. Treating the two as interchangeable may oversimplify the nuanced legal distinctions between them.

Implications for Online Platforms

If this ruling stands, platforms may need to reassess their content curation and targeted recommendation algorithms. The more a platform curates or recommends content, the more likely it is to lose § 230 protection for that activity. For now, this decision has opened the door for more lawsuits in the Third Circuit against platforms based on their recommendation algorithms. If the holding is adopted by other courts, it could lead to a fundamental rethinking of how social media platforms operate.

As a consequence, platforms might become hesitant to use sophisticated algorithms for fear of losing immunity, potentially resulting in a less curated, more chaotic online environment. This could, paradoxically, lead to more harmful content being visible, contrary to the court’s apparent intent.

Where To From Here?

Given the far-reaching consequences of this decision, TikTok is likely to seek en banc review by the full Third Circuit, and there is a strong case for the full court to grant it.

If unsuccessful there, this case is a strong candidate for Supreme Court review, since it creates a circuit split, diverging from § 230 interpretations in other jurisdictions. The Third Circuit even helpfully cites pre-Moody diverging opinions from the 1st, 2nd, 5th, 6th, 8th, 9th, and D.C. Circuits, essentially teeing the issue up for the Court.

In fact, there are indications that the Supreme Court would be receptive to an appeal in this case. The Court recently accepted an appeal in which it was expected to decide whether algorithm-based recommendations are protected under § 230, but after hearing oral argument it resolved the case on other grounds and never reached the § 230 issue. Gonzalez v. Google (U.S. May 18, 2023). Anderson presents another opportunity for the Supreme Court to weigh in on this issue.

In the meantime, platforms may start experimenting with different forms of content delivery that could potentially fall outside the court’s definition of curated recommendations. This could lead to innovative new approaches to content distribution, or it could result in less personalized, less engaging online experiences.

Conclusion

Anderson v. TikTok represents a potential paradigm shift in § 230 jurisprudence. While motivated by a tragic case, the legal reasoning employed could have sweeping consequences for online platforms, content moderation, and user-generated content. The decision raises fundamental questions about the nature of online platforms and the balance between protecting free expression online and holding platforms accountable for harmful content. As we move further into the age of AI-curated feeds, these questions will only become more pressing.

Anderson v. TikTok, Inc. (3d Cir. Aug. 27, 2024)

For two earlier posts on this topic see: Section 230 Supreme Court Argument in Gonzalez v. Google: Keep An Eye on Justice Thomas and Supreme Court Will Decide Whether Google’s Algorithm-Based Recommendations are Protected Under Section 230

Posting Your New Job Info on Facebook Is Not “Soliciting” Former Employer’s Customers

It’s not often that a Massachusetts Superior Court decision gets national attention, but if you search for Invidia, LLC, v. DiFonzo (Mass. Super. Ct. Oct. 22, 2012) you’ll see that legal blogs around the country have picked up on this obscure case.

Why? Because anything that involves the intersection of law and social media gets attention.

In this case, the issue that attracted attention was whether a hairdresser employed by a beauty salon in Sudbury, Mass., “solicited” her former employer’s customers in violation of a noncompete/non-solicitation agreement. What did Ms. DiFonzo do to trigger this claim? She posted news of her job change on her Facebook page. The court held that neither posting news of her new salon nor friending several customers constituted solicitation.

Professor Eric Goldman has a lot to say about this case, including questioning how widespread litigation in the hair salon industry improves social welfare. And he quite rightly gloats over the fact that an agreement like Ms. DiFonzo’s would not be enforceable in California, which prohibits employee non-competes by statute.

Not noted by most commentators outside of Massachusetts is the judge’s tentative conclusion that the customer goodwill this hairdresser developed may belong to her, not her salon. Most of Ms. DiFonzo’s customers appear to have been developed while she worked at the old salon, which makes this holding somewhat unusual: goodwill is usually found to belong to employees who bring customers with them to a job (in which case, they are allowed to leave with them).

Salon owners across the state must be pulling out their hair in frustration over this aspect of the decision.

The former employer’s motion for preliminary injunction was denied on multiple grounds. By my count, it’s hairdressers 2, salon owners 0 this year.

Invidia, LLC, v. DiFonzo (Mass. Super. Ct. Oct. 22, 2012)