Supreme Court rejects child pornography case against Reddit

This photo illustration shows the logo of Reddit on a mobile phone in Arlington, Virginia on January 29, 2021. | AFP via Getty Images/Oliver Douliery

The U.S. Supreme Court has declined to hear an appeal filed by a group of women suing Reddit and alleging the online platform is profiting from the posting of child pornography, allowing an earlier decision in favor of the popular website to stand.

In an orders list released Tuesday morning, the nation's high court refused to grant a writ of certiorari and hear oral arguments in the case of Jane Does 1-6 et al. v. Reddit, Inc. The case centers on whether the social media platform knowingly participated in sex trafficking.

In April 2021, a woman identified as "Jane Doe" filed a class action lawsuit against Reddit after her ex-boyfriend posted sexually explicit images of her from when she was a minor on the platform without her consent.

Although Doe repeatedly reported the posts to Reddit, the website allegedly did little to help take down the images, and her ex-boyfriend repeatedly reposted them to the platform.

"Because Reddit refused to help, it fell to Jane Doe to monitor no less than 36 subreddits — that she knows of — which Reddit allowed her ex-boyfriend to repeatedly use to repeatedly post child pornography," read the lawsuit, as reported by Courthouse News Service.

"This is despite the fact that, as Reddit well knew, throughout this time her ex-boyfriend uploaded the content from the identical IP address."

The district court ruled against the women and their families, finding that Reddit was shielded by Section 230 of the 1996 Communications Decency Act, a federal law that protects websites from liability for content their users publish on their platforms.

Last October, a three-judge panel of the 9th U.S. Circuit Court of Appeals unanimously ruled in favor of Reddit, with Circuit Judge Milan D. Smith Jr. authoring the opinion.

"Moreover, the plaintiffs have not alleged a connection between the child pornography posted on Reddit and the revenue Reddit generates, other than the fact that Reddit makes money from advertising on all popular subreddits," wrote Smith.

"Plaintiffs who have successfully alleged beneficiary liability for sex trafficking have charged defendants with far more active forms of participation than the plaintiffs allege here."

Earlier this year, the National Center on Sexual Exploitation placed Reddit on its "Dirty Dozen List" of businesses that profit or enable sexually exploitative practices, arguing that the site is "a hub of image-based sexual abuse, hardcore pornography, prostitution — which very likely includes sex trafficking — and overt cases of child sexual exploitation and child sexual abuse material."

While Reddit has touted a new policy against sharing explicit material such as revenge porn, along with improved efforts to remove reported content, NCOSE argues that "these policies on paper are not translating into proactive prevention or removal of abuse in practice."

In 2021, NCOSE warned that Reddit had become a "hub of exploitation where sex buyers and other sexual predators meet to exchange non-consensually shared intimate images, hardcore pornography, and to give advice to each other about how to use and abuse."

Reddit isn't the only social media platform sued over users posting child pornography. Earlier this month, the 9th Circuit dismissed a lawsuit claiming Twitter profited from sex trafficking by allowing tweets containing sexually explicit images of two 13-year-old boys. 

The plaintiffs sued Twitter in 2021. One of them said he was recruited for sex trafficking as a minor but managed to escape; however, the child sexual abuse videos in which he was featured were later disseminated on Twitter.

His parents reportedly contacted authorities and Twitter to ask that the posts be taken down immediately. The lawsuit contends that Twitter told the family that the material didn't violate the company's policies. It wasn't until law enforcement got involved that Twitter removed the videos, but not before they amassed over 167,000 views and over 2,200 retweets.

The 9th Circuit also cited Section 230 of the Communications Decency Act to rule in favor of Twitter. 
