Under Elon Musk, Twitter has nearly doubled the rate at which it suspends accounts posting images and videos exploiting child sexual abuse, according to a cybersecurity expert. Yet one advocacy group warns that while CSAM hashtags have been removed, such content is still easily found.
Andrea Stroppa of Ghost Data, who has been working alongside Twitter’s Trust and Safety team, tweeted a thread on Saturday providing updates on the effort to combat child sexual exploitation material.
“Twitter updated its mechanism to detect content related to child sexual abuse/exploitation material. Faster, more efficient, and more aggressive. No mercy for those who are involved in these illegal activities,” tweeted Stroppa. “The daily suspension rate has ALMOST DOUBLED over the past few days. It means that Twitter is doing a capillary analysis of contents, especially those published in the past. It doesn't matter when illicit content has been published. Twitter will find it and act accordingly.”
Stroppa added that over a recent 24-hour period, Twitter “took down 44,000 suspicious accounts, including over 1,300 profiles that tried to bypass detection using codewords and text in images to communicate. Zero tolerance.”
Elon Musk, the billionaire who now owns Twitter, weighed in on the update by posting a reply tweet saying, “Thank you for helping with this important problem.”
Lina Nealon, director of corporate and strategic initiatives for the National Center on Sexual Exploitation, told The Christian Post that while it was “a step in the right direction,” she felt there was still more to be done.
“Any effort Twitter is taking to combat child sexual abuse material and other illegal content on the platform is a step in the right direction. At the very least, [Elon Musk] is helping to shine light on child sexual abuse, a massive, destructive problem in our society,” said Nealon. “However, addressing the bare minimum and making some ethical decisions does not a hero make. It’s easy to say that child protection is a top priority, but Twitter’s actions point to profit being the top priority.”
Describing Twitter as having long been “the front porch for predators on the internet,” Nealon told CP that she believes “small improvements have been made.” Yet, she added, “other decisions” have “undercut the effectiveness of those changes.”
“For instance, some known child sexual abuse material [CSAM] hashtags have been removed, but that content is still easily found,” Nealon explained. “Twitter also added back the CSAM reporting features, but simultaneously gutted the departments charged with investigating and addressing those reports.”
Last month, NCOSE expressed concern about Twitter’s reported plans to launch a platform that would compete with OnlyFans, which the group believes will lead to more sexually exploitative content.
Earlier this year, NCOSE placed Twitter on its “Dirty Dozen List” of businesses and other entities that promote sexually exploitative material, namely for the site’s failure to take down accounts dedicated to promoting commercial sex and its failure to listen to victims of sexual exploitation.
In January 2021, NCOSE even filed a lawsuit against Twitter, claiming that the site refused to remove sexually graphic videos of an unnamed teenager and another anonymous minor that were posted by sex traffickers.
“This lawsuit seeks to shine a light on how Twitter has enabled and profited from [child sexual abuse material] on its platform, choosing profits over people, money over the safety of children, and wealth at the expense of human freedom and human dignity,” stated the lawsuit.