Twitter sued after child sexual abuse video received over 167K views, 2K retweets

The Twitter logo is displayed on a screen on the floor of the New York Stock Exchange (NYSE) in New York City, U.S. | Reuters/Brendan Mcdermid

A federal lawsuit was filed against social media giant Twitter on Wednesday on behalf of a teenager who claims the company waited days to remove sexually graphic videos of him and another minor that sex traffickers had posted to its platform. 

The lawsuit, filed in the U.S. District Court for the Northern District of California, was brought on behalf of the minor, identified in the filing as a 16-year-old Florida resident. To protect the child’s privacy, the plaintiff is referred to in the filing as John Doe. 

Doe is being represented jointly by the National Center on Sexual Exploitation, the Haba Law Firm and the Matiasic Firm. 

According to the complaint, Doe was harmed by videos on Twitter depicting his sexual abuse. The lawsuit accuses Twitter, which has over 300 million users, of refusing to remove the child pornography videos even after the company was notified by the plaintiff and his parents. 

“This lawsuit seeks to shine a light on how Twitter has enabled and profited from [child sexual abuse material] on its platform, choosing profits over people, money over the safety of children, and wealth at the expense of human freedom and human dignity,” the lawsuit reads. 

The filing states that Doe was solicited and recruited for sex trafficking as a minor. But after Doe escaped the traffickers’ manipulation, child sexual abuse videos featuring him were disseminated on Twitter. 

According to NCOSE, Doe was horrified to discover that videos made when he was 13, under duress from sex traffickers, had been posted to Twitter. His parents reportedly contacted authorities and Twitter to ask that the posts be taken down immediately. 

NCOSE alleges that instead of removing the video content, Twitter reported back to the family that the material did not violate any of the company’s policies. 

As a result, videos featuring Doe amassed over 167,000 views and over 2,200 retweets before law enforcement involvement pressured Twitter to remove the videos in question. 

“When Twitter was first alerted to this fact and John Doe’s age, Twitter refused to remove the illegal material and instead continued to promote and profit from the sexual abuse of this child,” the lawsuit argues. 

Although Doe lives in Florida, the lawsuit was filed in California because that is where Twitter is headquartered. 

The lawsuit contends that Twitter permits “numerous profiles, posts, comments, and other content either advertising, soliciting, or depicting” child sexual abuse.

“As John Doe’s situation makes clear, Twitter is not committed to removing child sex abuse material from its platform,” NCOSE Senior Legal Counsel Peter Gentala said in a statement.

“Even worse, Twitter contributes to and profits from the sexual exploitation of countless individuals because of its harmful practices and platform design. Despite its public expressions to the contrary, Twitter is swarming with uploaded child pornography and Twitter management does little or nothing to prevent it.”

A Twitter spokesperson told The Christian Post that the company will not offer comment on the lawsuit directly but stated that Twitter has "zero-tolerance for any material that features or promotes child sexual exploitation."

"We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy," the Twitter spokesperson wrote in an email. 

"Our dedicated teams work to stay ahead of bad-faith actors and to ensure we’re doing everything we can to remove content, facilitate investigations, and protect minors from harm — both on and offline."

The troubles for Doe began in 2017, when he struck up a dialogue on Snapchat with a user he falsely believed attended his school. The exchange led to Doe sending naked photos of himself to that account.

“After he did so the correspondence changed to blackmail,” the court document states. “Now the Traffickers wanted more sexually graphic pictures and videos of John Doe, and recruited, enticed, threatened and solicited John Doe by telling him that if he did not provide this material, then the nude pictures of himself that he had already sent would be sent to his parents, coach, pastor, and others in his community.”

Doe initially complied with the traffickers’ demands and provided videos of himself engaging in sexual acts, including acts with a minor friend. Doe later blocked the traffickers on Snapchat, and communications eventually ceased. 

But in 2019, a compilation video of multiple videos sent by Doe to the traffickers surfaced on Twitter in posts published under the Twitter handles @StraightBross and @fitmalesblog. 

Twitter was alerted about the materials first by a concerned citizen in December 2019, according to the lawsuit. 

In January 2020, Doe was informed that students at his school had seen the videos. As a result, he faced teasing, harassment, vicious bullying and became suicidal. 

After both Doe and his mother sent complaints to Twitter, the lawsuit explains that Twitter responded on Jan. 28, 2020, stating that: “We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.”

The lawsuit contends that if Twitter had reviewed the material as it claimed, officials would have seen comments to the video that “acknowledge that the material was depicting minors.” Doe responded in shock that Twitter did not see any problem with the content. 

“Eventually, through a mutual contact, Jane Doe was able to connect with an agent of the U.S. Department of Homeland Security,” the lawsuit states. “The federal agent also initiated contact with Twitter and at the request of the U.S. federal government, the CSAM was finally removed from Twitter on or about January 30, 2020.”

Twitter was listed last February by NCOSE on the advocacy group’s annual Dirty Dozen List, which names organizations the group deems complicit in “perpetuating” sexual exploitation in any form. 

Reports have suggested that as many as 10 million Twitter accounts may include sexually explicit content or other content deemed “not safe for work.” NCOSE has argued that Twitter has done “little to stem the overwhelming tide of sex trafficking, prostitution and pornography accounts on its site.” The organization warned that Twitter is used every day to advertise commercial sexual exploitation. 

“It has been documented by law enforcement that pimps and sex traffickers often either coerce trafficking or child sexual abuse victims into making such social media or advertising posts or create the posts themselves in their victim’s name,” the 2020 Dirty Dozen List explains. “This is what was found to happen on — the notorious classifieds ads website that was recently shut down by the Department of Justice for knowingly facilitating sex trafficking.”

Twitter maintains that it has clear rules in place to address non-consensual nudity, the sharing of private information and child sexual exploitation. Twitter publishes a transparency report, which details how many accounts have been removed. The vast majority of accounts are "detected through our internal tools and technology."
