(CNN Business) For years, YouTube has grappled with which types of content to remove, from terrorist propaganda and hate speech to misinformation about vaccines. It's now struggling over what to do about the QAnon conspiracy theory.
In an interview with CNN's Poppy Harlow for the Boss Files podcast, YouTube CEO Susan Wojcicki stopped short of pledging to ban QAnon followers on the platform.
"We're looking very closely at QAnon," Wojcicki said. "We already implemented a large number of different policies that have helped to maintain that in a responsible way."
Believers in the far-right conspiracy theory, which first appeared three years ago, have embraced a number of different and sometimes contradictory ideas. But the basic false beliefs underlying QAnon are claims that a cabal of politicians and A-list celebrities engages in child sex abuse, and that a "deep state" is working to undermine President Trump.
Wojcicki pointed to changes made to YouTube's recommendation system, which she said have reduced viewership of QAnon content by more than 80%. She said "a lot" of QAnon-related content would be classified in what YouTube calls "borderline content" -- which doesn't explicitly break its rules.
"We also have already removed a lot of it, in terms of hundreds of thousands of videos because it could violate other parts of our policies: hate, harassment, Covid information," Wojcicki said. "There's been quite a lot of videos that have been taken down or the views have been reduced."
Other social media platforms have become more aggressive in policing such content. Last week, Facebook announced a ban on any pages, groups, and Instagram accounts representing QAnon on its platform.
While Facebook was key for spreading QAnon ideas, the conspiracy theory has also thrived on YouTube.
Harlow asked Wojcicki about the hesitation to ban QAnon on YouTube, especially as the FBI has identified its followers as a potential domestic terrorism threat. Wojcicki did not provide a clear answer.
"I think with every policy, it has to be defined very clearly. Like what does that exactly mean, a QAnon group exactly?" Wojcicki said. "That's a kind of thing that we would need to put in terms of the policies and make sure that we were super clear. So we are continuing to evolve our policies here. It's not that we're not looking at it or we don't want to make changes.
"I think the way to approach it is by actually having the policies implemented in the right way. And our platform is very different from how Facebook works. And so I think each of us will take an approach that makes the most sense for our platforms," she added.
Harlow also asked Wojcicki about YouTube's plan for election night. Wojcicki repeatedly dodged questions about whether she makes the final call on decisions such as taking down content from presidential candidates who prematurely declare victory, and she declined to comment on hypothetical scenarios.
"We have virtual war rooms to make sure that we have all the right people. We can make those decisions very quickly. We have people who are scouting the internet. We have an intelligence desk," she said. "We will have the full set of information and everyone ready to make sure that we are making the right calls."