Social media company Twitter has cited the recent violence at the United States Capitol building in its decision to permanently suspend tens of thousands of user accounts.
On January 6, rioters forced their way into the Capitol building, interrupting a Joint Session of Congress in which the results of the 2020 US presidential election were being certified. Five people died in the violent assault.
In a blog post published late on Monday night, Twitter announced that it had suspended 70,000 user accounts associated with the QAnon movement.
“Given the violent events in Washington, DC, and increased risk of harm, we began permanently suspending thousands of accounts that were primarily dedicated to sharing QAnon content on Friday afternoon,” stated Twitter.
“These accounts were engaged in sharing harmful QAnon-associated content at scale and were primarily dedicated to the propagation of this conspiracy theory across the service.”
Among the circulated images of individuals who stormed the Capitol was a photo of Jake Angeli, a well-known supporter of QAnon who refers to himself as the QAnon Shaman.
QAnon followers hold a series of beliefs widely discredited as conspiracy theories, one of which is that President Donald Trump is fighting against a group of prominent Democrats, Hollywood elites, and "deep state" allies who engage in the sexual abuse of children.
The movement began in 2017 with a series of posts on the message board 4chan by an individual who signed themselves "Q."
In August 2020, Facebook removed or restricted over 10,000 QAnon-linked groups, pages, and accounts across the social network and Instagram as part of a crackdown on "militia organizations and those encouraging riots, including some who may identify as Antifa."
YouTube has removed tens of thousands of QAnon videos. In October 2020, the company announced that it had expanded its hate and harassment policies to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.
"One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate," said YouTube.