What we cover here
Watch this space for regular updates leading up to the election, and through any turmoil after.
We created Misinformation Watch to provide CNN readers with a destination for misinformation-related U.S. election coverage. With the election process now complete, this is our last post on Misinformation Watch. It is far from the end of our coverage of misinformation, though.
The theory moved through well-trod channels of misinformation -- QAnon groups, conspiratorial hashtags, fringe YouTube personalities, the former President’s social media accounts -- growing more complex as it spread. Eventually, the theory was so varied and multi-pronged that it was all but impossible to disprove to those who subscribed to it. It spilled out into the physical world and, eventually, its adherents spilled blood.
We watched, in real time, both the birth and the ultimate destructive power of that false reality. It was the perfect storm of misinformation, and it was another warning.
Misinformation is not simply the result of conspiratorial internet posters or grifters sowing fear to make a buck. Though it thrives under tech giants’ uneven content moderation policies, it will not be stamped out solely by more robust self-policing by these platforms.
Misinformation’s impact has as much to do with the ways in which we are served online content as with the content itself. It is a consequence of how social media and internet companies are built and how they profit. A company that can collect an unfathomable amount of information about a person -- where they live, who they love, whether they are happy, whether they have a job, what secret questions they ask, whether they might be interested in buying a bulletproof vest -- can target content and advertisements at them with precision.
When that company’s ultimate goal is simply to keep that person scrolling, to keep them returning to the platform, then the alternative realities we have seen emerge are inevitable.
In the wake of the Capitol insurrection, panicked tech titans took broad action against purveyors of lies and conspiracy, ousting tens of thousands of accounts, including ones belonging to then-President Donald Trump. It was a moment of real change.
Misinformation and conspiracy communities that were pushed out of mainstream homes like Facebook and Twitter found safe haven on Parler, which was, in turn, pushed off its web hosting service.
“Stop the Steal” conspiracy theorists, QAnon believers, and other fringe communities scattered and splintered as they sought new homes online. The impact of this exodus on the internet and the rest of the US is yet to be known.
While this project has come to an end, we’ll continue to cover both the origins and impact of misinformation. It is a topic of vital importance and we’re committed to covering it. Thanks for reading, and please keep coming back for more of our coverage of the issue — we’ve got a lot more in mind.
But as Biden raised his hand and swore an oath to defend the Constitution, becoming the nation's 46th president — nothing happened.
Facebook said Tuesday that since August it has removed about 18,300 Facebook profiles and 27,300 accounts on Facebook-owned Instagram for violating its policies against QAnon. The company has also removed 10,500 groups and 510 events for the same reason.
“As of January 12, 2021, we have identified over 890 militarized social movements to date," the company said.
Last week, Twitter announced it had suspended more than 70,000 accounts for promoting QAnon.
Despite crackdowns, the conspiracy theory continues to spread on Twitter, Facebook and fringe social platforms, according to new research from nonpartisan nonprofit Advance Democracy.
Over the holiday weekend, more than 1,280 accounts related to QAnon posted on Twitter about 67,000 times, peddling conspiracy theories about the election and President-elect Joe Biden, according to the research.
For example, one QAnon account shared a 45-second video rife with false claims about election fraud. The video racked up about 360,000 views. After CNN Business flagged the video, Twitter took down the account for violating its rules against ban evasion.
Less than 24 hours before Joe Biden is set to become president, Facebook continues to show ads for tactical gear despite vowing to ban those promotions ahead of the inauguration.
A review this week by CNN, along with findings reported by other internet users, showed that ads for body armor, holsters and other equipment were being displayed on the platform as late as Tuesday afternoon.
Often, the advertised products are pictured alongside guns, ammunition, or people clad in camouflage fatigues.
The ads have frequently appeared in the timelines of military veterans and contribute to a false narrative of an imminent violent conflict in the United States, according to Kristofer Goldsmith, founder and president of High Ground Veterans Advocacy.
“They’re selling the idea of pending violence, or inevitable violence, and that’s the kind of thing that becomes a self-fulfilling prophecy,” said Goldsmith.
In one example still on Facebook Tuesday afternoon, a pair of noise-reducing earbuds was being advertised as a form of active hearing protection, shown inserted in the ears of a gunman aiming down his rifle sights.
Another ad, for body armor, promises consumers that the product can shield them from bullets, knives, stun guns and other threats.
A third series of ads, for hard-knuckled gloves, showed a man wearing desert camouflage and a tactical rig performing various tests on the gloves, including punching concrete walls, breaking a glass bottle by hand and rubbing broken glass on the gloves’ palms.
“They put people in combat gear in a civilian setting,” Goldsmith said of the ads. “They’re promoting this image of, ‘You need to get ready for combat.’”
“We already prohibit ads for weapons, ammunition and weapon enhancements like silencers,” Facebook said in a blog post. “But we will now also prohibit ads for accessories such as gun safes, vests and gun holsters in the US.”
Facebook appears to have removed some of the advertisements CNN found, including a series of ads for armored plates and plate carriers. The plates had, in some cases, been shown being held by heavily muscular individuals dressed in fatigues or being inserted into camouflage-patterned backpacks. Despite having seemingly removed some of the advertisers' ads, Facebook has allowed other ads for the same products, by the same advertisers, to persist on the platform.
Another now-removed series of body armor ads included marketing copy that claimed specific levels of protection under the rubric established by the National Institute of Justice.
Veterans are a popular target for misinformation and conspiracy theorists, Goldsmith said, because as a group they enjoy political and social authority. An endorsement by a veteran can reinforce a conspiracy theory's apparent credibility.
“If you change the mind of a veteran, there’s a good chance you change the minds of those within that veteran’s immediate circle — friends, family, coworkers,” said Goldsmith.
Groups and individuals spreading lies about the 2020 election and calling to protest the outcome have continued to hide in plain sight on Facebook, even as COO Sheryl Sandberg this week tried to downplay the platform's role in the Capitol riots.
From altering the names of their online forums to abusing the core features of Facebook's own services, conspiracy theorists have worked to evade content moderators despite the company's vows of a crackdown, new research shows.
These groups' efforts to remain undetected highlight the sophisticated threat confronting Facebook, despite its insistence that the situation is less of a problem on its platform than on others. It also raises new concerns that the groups' persistence on mainstream social networks could spark a new cycle of violence stretching well into Joe Biden's presidency.
The latest examples surfaced on Thursday, as extremism experts at the activist group Avaaz identified 90 public and private Facebook groups that have continued to circulate baseless myths about the election, with 166,000 total members.
Of those, a half-dozen groups appeared to have successfully evaded Facebook's restrictions on "stop the steal" content, according to Avaaz. Though many initially had "stop the steal" in their names, the groups have since altered their profiles, according to page histories reviewed by CNN Business — allowing them to blend in with other Facebook activity.
"So instead of 'Stop the Steal,' they became 'Stop the Fraud' or 'Stop the Rigged Election' or 'Own the Vote,'" said Fadi Quran, campaign director at Avaaz.
YouTube is working with top health organizations to create authoritative medical videos for the platform in an effort to crack down on Covid-19 misinformation.
The new health partnership team will be headed by Dr. Garth Graham, YouTube’s new director and global head of healthcare and public health partnerships. Graham was most recently the chief community health officer at CVS Health.
Like other tech platforms, YouTube has had to tackle the spread of misinformation about Covid-19.
The messaging app Zello said it has removed more than 2,000 channels on its platform related to armed extremism, and banned all “militia-related channels,” after it found evidence that some of its users participated in the Capitol riots.
“It is with deep sadness and anger that we have discovered evidence of Zello being misused by some individuals while storming the United States Capitol building last week,” the company said. “Looking ahead, we are concerned that Zello could be misused by groups who have threatened to organize additional potentially violent protests and disrupt the U.S. Presidential Inauguration Festivities on January 20th.”
Zello added that “a large proportion” of the channels it removed on Wednesday had been dormant for months and in some cases years.
The company is further analyzing the groups on its platform to determine whether any may violate its terms of service. But it added that because it does not store message content, the task is not as simple as running searches for keywords or hashtags and blocking them.
The messaging app Telegram is battling an increase in violent extremism on its platform amid a surge in new users, the company acknowledged to CNN Wednesday.
In the last 24 hours, the company has shut down "dozens" of public forums that it said in a statement had posted "calls to violence for thousands of subscribers."
But the effort has turned into a game of cat and mouse, as many of the forums' users set up copycats as soon as their old haunts were disabled. Screenshots and Telegram groups monitored by CNN show that a number of channels promoting white supremacy, hate and other extremism have been shut down, but that at least some have been replaced by new channels. And at least one meta-channel has emerged that maintains lists of deactivated groups and redirects visitors to the replacements. One now-defunct group that CNN reviewed had more than 10,000 members.
"Our moderators are reviewing an increased number of reports related to public posts with calls to violence, which are expressly forbidden by our Terms of Service," Telegram spokesperson Remi Vaughn told CNN. "In the past 24 hours we have blocked dozens of public channels that posted calls to violence for thousands of subscribers."
Vaughn added: "Telegram uses a consistent approach to protests and political debate across the globe, from Iran and Belarus to Thailand and Hong Kong. We welcome peaceful discussion and peaceful protests, but routinely remove publicly available content that contains direct calls to violence."
Telegram has surpassed half a billion active users worldwide. The company announced Tuesday that it had grown by 25 million users over the past several days -- with about 3 percent of that growth, or 750,000 new signups, occurring in the United States alone, Telegram told CNN.
Apps such as Telegram, Signal and MeWe have experienced explosive growth in recent days after WhatsApp sent a notification to its users reminding them that it shares user data with its parent, Facebook -- and following the suspension of President Donald Trump and the alternative social network Parler from many major tech platforms.
One of the people who has been reporting violent channels to Telegram is Gwen Snyder, a Philadelphia-based activist who said she has been monitoring far-right extremists on the platform since 2019. Earlier this week, as Telegram was witnessing a surge in new users, Snyder enacted a plan to organize mass pressure against Telegram’s content moderators.
“We started two days ago calling for Apple and Google to deplatform Telegram if they refused to enforce their terms of service,” Snyder told CNN. “We had dozens if not hundreds of relatively large-follower Twitter accounts amplifying the campaign.”
It’s difficult to determine whether Telegram’s actions were a direct result of the activism; Snyder said she never heard from Telegram, Apple or Google.
But at least some of the Telegram channels affected by the crackdown appeared to believe that Snyder’s efforts were responsible — and soon began posting her personal information online and targeting her with death threats.
“That’s my home address,” Snyder said in a public tweet, attaching a redacted screenshot of an extremist Telegram channel that had shared her information. Addressing Telegram, she added: “You're okay with this? ENFORCE YOUR OWN TERMS OF SERVICE.”