Not long ago, the concept of certain social media accounts and content being “shadow banned” was ridiculed as a hoax. But those days are over for good.
In the name of “insightful transparency,” Facebook on September 23 unveiled its Content Distribution Guidelines, a partial listing of the types of content whose visibility it limits on its platform or “demotes” in users’ news feeds. To be clear, these aren’t new protocols—Facebook is simply revealing to the public some of the standards governing its content suppression practices.
In a blog post, Anna Stepanov, Director of Product Management at Facebook, connected these protocols to three of Facebook’s corporate values: responding to direct feedback, incentivizing investment in high-quality content, and fostering an environment where users feel “safer” from “problematic” content.
“Some content may be problematic for our community, regardless of the intent,” Stepanov wrote. “We’ll make this content more difficult for people to encounter.”
“Types Of Content We Demote” includes such things as:
- Comments that Facebook predicts will be reported because comments like them have been reported before.
- Content that Facebook’s third-party fact-checkers have “debunked” as “False, Altered, or Partly False.”
- Content from news publishers that Facebook users “broadly” rate as “untrusted” in on-platform surveys.
- “Unoriginal” news articles lacking additional facts and analysis.
- Content “borderline to” or “likely violating” Facebook’s community standards.
- Besides violent and sexual imagery, this bucket includes content that could discourage COVID-19 vaccinations, or content posted by groups and pages “associated with (but not representing)” certain “conspiracy networks.”
What if your organization has sound editorial standards, wholesome audience engagement, and isn’t associated with any “conspiracy networks,” but social media analytics suggest your reach has been diminished nonetheless? Based on September 23’s big reveal, simply playing by the rules—as much as any group speaking about religious beliefs can, given the broad and ideologically informed nature of Facebook content standards—won’t protect your ministry or media company from content suppression on Facebook.
Instead, you’re at the mercy of the users, but not just those who “Like” your page and appreciate your messages. If hostile users flag and report your content because of religious or ideological viewpoint objections, this recent update confirms that Facebook’s stated corporate value of “responding to people’s direct feedback” may just win out, resulting in “demoted” posts and limited visibility.
Furthermore, if your organization posts any information about COVID-19 recovery or treatments, even from the perspective of a personal experience or testimony, you may run afoul of the new guidelines if Facebook deems the information “sensationalized.” (The White House revealed in July 2021 that the federal government was assisting Facebook in identifying COVID-19 misinformation.)
And that’s just the part we know. The true extent of Facebook’s content distribution policies remains cloaked in secrecy. However, the platform’s history of suppressing Christian views on sexual orientation and gender identity, abortion, and religious liberty may provide clues.
Enforcing “safe” conversation through algorithmic ideological judgments is a step toward a less humane world with less critical thought. The elevation of “safety” as a principle governing public expression will degrade the quality of public discourse, preclude countless challenging and worthwhile conversations, and silence messages with the power to transform hearts and minds.
NRB opposes religious viewpoint discrimination in every form, including content moderation policies that limit the ability to teach the Bible, preach the Gospel, and promote Christian values in the public square. We are wary that Facebook’s commitment to a “safe” experience may amount to an open season on religious viewpoints while doing nothing to improve the quality of online discourse. And we believe that mob rule and guilt by association are two very bad ways to “improve” the conversation.
Organizations must also proactively defend against these threats. One of the most important steps a ministry or media company can take right now is to back up your content, such as videos and content lists, somewhere other than the distribution platforms themselves. Don’t risk losing access to your content and messages by storing them solely in user accounts that can be suspended, locked, or removed. Ensure that your audience can access your digital media on more than one platform. Evaluate threats to your organization’s digital infrastructure before they materialize. Finally, develop a plan to get the word out if Big Tech de-platforming strikes, and communicate with NRB about any de-platforming actions taken against your organization.
NRB has monitored threats to religious freedom on new media platforms for over a decade. Today, as it has since its founding in 1944, NRB is committed to representing Christian broadcasting wherever threats to religious free speech emerge.
Troy A. Miller serves as the CEO of NRB. A senior executive with more than 30 years of management and business experience, Troy served for six years with Coral Ridge Ministries, three of those as the executive vice president and chief operating officer, focusing on strategic direction and planning. Previously, Troy spent 10 years with Gateway leading a number of business startups, including Gateway’s expansion into Europe and Asia, new manufacturing facilities, and global information technology application strategy. He has spoken at seminars on strategic business planning, information technology integration, organizational development, and Christian apologetics, and has spent time teaching pastors in the Far East.
Noelle Garnier is the policy strategist at NRB.