TikTok loophole allows users to post illegal child sex abuse material, advocate says
TikTok users are taking advantage of a private account login loophole to skirt the platform's rules and post illegal child sexual abuse material, an advocate has warned.
As Forbes reported earlier this month, TikTok users can post illegal child sex abuse material if they use a private posting login. Under this setting, the content is only visible to those with access to the private login information, escaping the notice of the site's moderators.
Seara Adair, a child sex abuse survivor and TikTok influencer, discovered the loophole in March after someone using a private TikTok account posted a video of a naked pre-teen doing "inappropriate things" and tagged Adair.
The influencer released a video drawing attention to the issue, which landed in the feed of a sibling of an assistant U.S. attorney in Texas. Adair also informed the U.S. Department of Homeland Security. An agent told her in March that the agency was investigating the matter.
In a Friday statement to The Christian Post, a spokesperson for TikTok wrote that the platform has "zero tolerance" for child sex abuse material, noting that this "abhorrent behavior" is prohibited.
"When we become aware of any content, we immediately remove it, ban accounts, and make reports to NCMEC," the TikTok spokesperson said, referring to The National Center for Missing & Exploited Children, a child protection organization.
The DHS did not immediately respond to The Christian Post's request for comment.
"There's quite literally accounts that are full of child abuse and exploitation material on their platform, and it's slipping through their AI," Adair told Forbes. "Not only does it happen on their platform, but quite often it leads to other platforms — where it becomes even more dangerous."
A Forbes investigation found that many users are sharing their login credentials with others despite TikTok policy banning the practice.
"The sheer volume of post-in-private accounts that Forbes identified — and the frequency with which new ones pop up as quickly as old ones are banned — highlight a major blind spot where moderation is falling short and TikTok is struggling to enforce its own guidelines, despite a 'zero tolerance' policy for child sexual abuse material," Forbes found.
Lina Nealon, director of corporate and strategic initiatives for the National Center on Sexual Exploitation, told CP that child sex abuse material can "flourish" on the internet and emphasized the vulnerability of children.
"TikTok must confront this massive problem, especially given its popularity among children, and add sufficient resources to prioritize child protection," Nealon wrote in a Wednesday statement.
"No matter where children are on the internet, predators can and will find a way to reach them. What is happening on TikTok is not exclusive to that platform. Instagram, Snap, Discord, and others have vulnerabilities and parents must be aware and diligent," she continued.
"Tech platforms must also step up to better protect young users from harm and do all that they can do ensure safety."
The anti-sexual exploitation advocate believes parents can't bear the burden of protecting their kids alone. She urges the U.S. Congress to hold tech platforms accountable by supporting solutions like the "Kids Online Safety Act" and the "Children and Teens' Online Privacy Protection Act."
"Parents can help protect their children by ensuring that parental controls are turned on; by having regular age-appropriate conversations with their children; and by advocating for solutions to curb these issues of online sexual abuse," Nealon wrote.
NCOSE's website lists the ways predators can target children on TikTok and offers steps parents can take to shield their children from sexual exploitation.
Following a meeting between TikTok and NCOSE officials last year, TikTok implemented several safety features, according to the group. NCOSE had voiced concerns about how the platform allowed predators to comment on minors' posts or request sexual images via direct messages.
Since the meeting, TikTok disabled direct messaging for users under 16 and implemented parental control locks.
"While these changes are certainly encouraging, it remains to be seen how well new policies will be put into practice," NCOSE states on its website.
"We remain concerned about the extent of harmful content still accessible by young users — including advertising for pornography and prostitution sites — and believe there is still more TikTok could do to protect youth using their platform."
The DHS investigated TikTok several times between 2019 and 2021 over accusations of child sexual exploitation on the platform.
In September, Instagram suspended Pornhub's account. The pornography website's social media handle had 13 million followers and over 6,200 posts at the time of the suspension. While the account did not contain pornographic content, it did encourage users to become pornography performers.
Pornhub is also facing several lawsuits, including a $500 million class-action lawsuit against its parent company, MindGeek. The complaints are related to minors and other victims who accused the site of hosting sexually explicit material of them without their consent.
Last month, an Alabama mother filed a lawsuit claiming that Pornhub hosted content depicting a man sexually abusing her 12-year-old son. According to the filing, the assailant lived with the woman's family from July 2018 to October 2018, assaulting her 12-year-old son and another child during this period.
According to the lawsuit, Pornhub collaborated with the abuser to disseminate "obvious images and videos" of child molestation for profit. One such video depicting the abuse garnered over 188,000 views and over 1,100 subscribers for Pornhub. Another was up for sale on Pornhub for $15.
Samantha Kamman is a reporter for The Christian Post. She can be reached at: email@example.com.