Facebook ban of QAnon is only a first step in the battle against dangerous conspiracy theories
The decision by Facebook to remove any pages and groups associated with the far-right conspiracy theory movement known as QAnon will disrupt the ability of dangerous online communities to spread their radical messages, but it won’t stop them completely.
Facebook’s announcement on Oct. 6 that it would take down any “accounts representing QAnon, even if they contain no violent content” followed earlier decisions by the social media platform to down-rank QAnon content in Facebook searches.
This month marks the third anniversary of the movement that started when someone known only as Q posted a series of conspiracy theories on the internet forum 4chan. Q warned of a deep state satanic ring of global elites involved in pedophilia and sex trafficking, and asserted that U.S. President Donald Trump was working on a secret plan to take them all down.
QAnon now a global phenomenon
Until this year, most people had never heard of QAnon. But over the course of 2020, the fringe movement gained widespread traction both in the United States and internationally, with supporters including a number of Republican politicians who openly campaigned as Q backers.
I have been researching QAnon for more than two years and its recent evolution has shocked even me.
What most people don’t realize is that the QAnon of July and August was a different movement from what QAnon has become in October. I have never seen a movement evolve or radicalize as fast as QAnon, and it’s happening at a time when the global socio-political environment is much different than it was in the summer.
All of these factors came into play when Facebook decided to take action against “militarized social movements and QAnon.”
In the weeks leading up to the ban, I had seen a trend toward more violent content on Facebook, especially the circulation of memes and videos promoting “vehicle ramming attacks” with the slogan “all lives splatter” and other racist messages targeting Black people.
In explaining its ban, Facebook noted while it had “removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the (U.S.) West Coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public.”
Prior action was ineffective
Prior to the outright ban, Facebook’s earlier attempts to disrupt QAnon groups from organizing on Facebook and Instagram were not enough to stop its fake messages from spreading.
One way Q supporters adapted was through lighter forms of propaganda, something I call Pastel QAnon. To circumvent the initial Facebook sanctions, women who believe in the QAnon conspiracies used warm and colourful images to spread QAnon theories through health and wellness communities and by infiltrating legitimate charitable campaigns against child trafficking.
QAnon followers have used softer messaging and female-focused imagery to infiltrate lifestyle social media communities — a tactic researcher Marc-André Argentino calls ‘Pastel QAnon.’ (Twitter), Author provided
The latest move by Facebook will still allow Pastel QAnon to exist in adjacent lifestyle, health and fitness communities — a softening of the traditionally raw QAnon narratives, but an effective way to spread the conspiracies to new audiences.
Some QAnon pages have survived ban
Facebook will certainly be monitoring any attempts by the QAnon community to circumvent the ban. And while Facebook’s action reduced the number of QAnon accounts, it didn’t eliminate them completely — and realistically will not. My research shows the following:
QAnon public groups: 186 pre-ban; 18 post-ban.
QAnon public pages: 253 pre-ban; 66 post-ban.
Instagram accounts: 269 pre-ban; 111 post-ban.
Facebook’s actions will do permanent damage to QAnon’s presence on the platform in the long run. In the short and medium term, we will see pages and groups reforming and trying to game the Facebook algorithm to avoid detection.
However, with little remaining presence on Facebook to quickly amplify new pages and groups, and with the changes to the search algorithm, these efforts will not be as effective as they were in the past.
Where will QAnon followers turn if Facebook is no longer the most effective way to spread their theories? Already, QAnon has fragmented further into communities on Telegram, Parler, MeWe and Gab. These alternative social media platforms are not as effective for promoting content or merchandise, which will hurt grifters who were profiting from QAnon and limit the reach of proselytizers.
But the ban will push those already convinced by QAnon onto platforms where they will interact with more extreme content than they may have found on Facebook. This will further radicalize some individuals, or accelerate the process for others who were already on this path.
Like a religious movement
What we will likely see eventually is the balkanization of the QAnon ideology. It will be important to recognize that QAnon is more than a conspiracy theory; it is closer to a new religious movement. It will also be important to consider how QAnon has been able to absorb, co-opt or adapt itself to other ideologies.
Though Facebook has taken this important step, there will be much work ahead to make sure QAnon doesn’t reappear on the platform.
A bigger issue still is that YouTube has yet to take action against QAnon. Video is the most-used medium for circulating QAnon content across digital ecosystems. As long as QAnon still has a home on YouTube, we will continue to see its content on all social media platforms, and there is not much Facebook alone can do about that. Countering QAnon will ultimately require a multi-platform effort.
Technology and platforms provide a vector for extremist movements like QAnon. At its root, however, this is a human issue, and the current socio-political environment around the world is fertile ground for QAnon’s continued existence and growth.
Facebook’s move is a step in the right direction, but this is not the end game. There is much work ahead for those working in this space.
Marc-André Argentino receives funding from Concordia University. Marc-André Argentino is affiliated with the Global Network on Extremism & Technology, the Institute for Strategic Dialogue, and le Centre d'expertise et de formation sur les intégrismes religieux, les idéologies politiques et la radicalisation.