After years of allowing a baseless and dangerous conspiracy to live and grow to new heights on its platform, Facebook finally decided on Tuesday evening to ban QAnon and all related content from Facebook and Instagram. The decision will wipe away thousands of accounts, groups, and pages associated with the widespread delusion, but the company will still allow individuals to post QAnon-related content on their personal accounts.
The move amounts to the perfect encapsulation of Facebook's inadequacies. The company willingly allowed QAnon to operate on its platforms uninterrupted for years, even driving people to the conspiracy-minded groups with its recommendation algorithms.
According to an investigation conducted by The Guardian, pages and groups dedicated to QAnon had more than three million followers as recently as June, and were growing quickly thanks to Facebook's vested interest in keeping people locked into the platform for as long as possible. For months, the company's algorithm pushed Q-obsessed groups to users who had liked or followed pro-Trump, anti-vaccine, and anti-lockdown groups and pages. The conspiracy started to permeate wellness and yoga pages and other groups dedicated to natural and new age healing. It even started to suck in concerned mothers when QAnon members hijacked the #SaveOurChildren movement, spreading its asinine message that celebrities and Democratic politicians were participating in massive sex trafficking rings.
The collective delusion of QAnon, which has sucked in hundreds of thousands if not millions of people, started on the fringes. It began on 4chan and eventually moved to 8chan, a notorious site set up to be a bastion of free speech that largely devolved into a hub for hate, which also just so happens to have a significant audience of Trump supporters. Q could have stayed there, according to a Reply All investigation into the origins of the conspiracy theory, but those invested in the movement made a concerted effort to bring it into the mainstream. That included pushing it through conspiracy-friendly channels like InfoWars, setting up a more accessible platform for older and less tech-savvy followers on Reddit, and spreading the word on Facebook.
Compared to its fellow social networking companies, Reddit responded relatively quickly to QAnon's attempts to invade its space. After months of activity and a number of violent threats hosted on the platform, Reddit took action and banned QAnon in 2018. That didn't entirely stop Q believers from posting across Reddit, particularly in subreddits like r/conspiracy, but today the platform is largely free of the extremists.
Had other companies taken a similar tack, QAnon could have been deprived of the air that it needed to breathe above the surface of the internet's underground. Instead, it thrived — particularly on Facebook, where the groundwork was already laid for conspiracists to find one another. Facebook did little to stop the spread of Pizzagate, a conspiracy that posited Hillary Clinton and other Democrats were operating a child sex trafficking ring out of a pizza parlor, which arguably served as the seed from which QAnon grew. It did little as far-right groups and users started to use the platform to widely spread misinformation about just about everything, from election security to coronavirus. It took an actual shooting, which left two people dead and one injured, for Facebook to take action against armed militia groups that organized on its platform. Even some of those actions have been ineffective, as reports have found extremist groups like the Boogaloo movement still thriving on the platform despite being banned and studies have found that white nationalist groups still operate largely uninterrupted on Facebook.
So Facebook has finally banned QAnon, but the armed and conspiracy-minded horse is already out of the barn. The conspiracists behind the movement used Facebook to push it for years and the company did little to intervene. The Wall Street Journal reported the conspiracy theory saw a surge in new followers in August across Facebook and Instagram, even as Facebook nibbled at the edges, purging some QAnon groups without cracking down on the entire movement.
At least now we finally know to what level of public awareness an extremist group needs to rise before Facebook will do anything about it. QAnon is considered a domestic terrorist threat by the FBI. Members of the group have carried out acts of violence in the name of QAnon, including the murder of a Staten Island mob boss. A QAnon believer, Marjorie Taylor Greene, will almost certainly be elected to Congress this November. Nearly half of all Americans have heard of QAnon, and a disturbing number of Republicans believe at least some of the conspiracy is true.
Facebook banned QAnon. It also made it what it is today, and is still allowing countless other hateful groups to flourish on its platform. The company doesn't deserve praise for cleaning up its own mess.