On Wednesday morning, Facebook updated its ban on QAnon content, saying it would redirect people searching for the conspiracy theory "to resources that can help inform them of the realities of QAnon and its ties to violence and real world harm."
But less than two hours after implementing the redirect, the company had to hit pause after a "glitch" caused people to see QAnon information even when they weren't looking for it.
"When we first launched the Redirect Initiative for QAnon today there was a glitch that caused people to see information about this topic when they searched for unrelated terms. We've paused this Redirect while we fix the issue," Facebook said in a tweet Wednesday.
Facebook did not respond to questions about when and how the redirect would be re-implemented.
The update was intended to send people to resources from the Global Network on Extremism and Technology, the academic research arm of the Global Internet Forum to Counter Terrorism, an initiative created and funded by Facebook in partnership with other social media giants such as YouTube, Twitter, and Microsoft, which owns LinkedIn.
"As we continue to study the impact of our enforcement against QAnon, we'll partner with GNET to assess the impact of this Redirect Initiative, and we'll continue to reassess the list of terms that, when searched for on our platform, should direct people to these resources," Facebook said in its blog post on the update earlier Wednesday.
Facebook has taken various steps to crack down on QAnon, the unfounded far-right conspiracy theory — which holds that a cabal of Satan-worshiping, child-trafficking Democrats is plotting to oust President Donald Trump — following pressure from various groups including users, employees, advertisers, misinformation experts, and lawmakers.
The company has been criticized for acting slowly against QAnon, announcing only earlier this month that it would remove all pages, groups, and Instagram accounts promoting the conspiracy theory. That ban, which Facebook said would be enacted gradually, came after the platform announced over the summer that it had removed 790 QAnon Facebook groups.
BuzzFeed News also reported Tuesday that CEO Mark Zuckerberg plans to roll back many of the company's steps aimed at slowing the spread of misinformation following the upcoming elections.
Extremism researchers are tracking how the new ban will play out, as the movement has spread rapidly on Facebook and on Instagram, where many adherents are using "Save the Children" rhetoric in an attempt to further the movement's misguided focus on human-trafficking conspiracy theories.
QAnon has also grown increasingly mainstream among the GOP base, fueled by Trump repeatedly refusing to denounce the conspiracy theory, questioning at a recent town hall whether it even was a conspiracy theory, and claiming its adherents were fighting pedophilia.