Editor's Note: Ethan Zuckerman is an associate professor of public policy, information and communication at the University of Massachusetts at Amherst and director of the Institute for Digital Public Infrastructure. He is the author of the new book "Mistrust: Why Losing Faith in Institutions Provides the Tools to Transform Them." The views expressed in this commentary are his own. View more opinion at CNN.
(CNN) On January 8, President Donald Trump was kicked off his soapbox: He lost his Twitter account. After suspending Trump over posts that praised Capitol riot participants, Twitter concluded that his follow-up tweets might incite future violence and permanently banned him. Facebook has also suspended the President's account.
These actions reflect a major shift in how large social media platforms handle extreme speech. They have long tolerated right-wing extremists and were particularly hands-off with Trump, concluding that his tweets and posts -- for all his flagrant lying and bullying -- were important dispatches from a world leader, and had to stay up for everyone in our democracy to see.
But over time, lines have been drawn.
The conspiracy theorist Alex Jones was deplatformed in 2018, helping start a migration of right-wing users onto parallel platforms such as Parler. It's possible that Trump, now sidelined and silent online, may reemerge on another platform, making it more influential -- and subject to more public scrutiny. Parler is offline now because Amazon removed its hosting services over concerns about unmoderated violent content.
But incitement to violence and hate aren't the only reasons platforms are removing extreme speech. Last spring, we saw coordinated action among the main tech platforms to remove the "Plandemic" video, in which a discredited scientist railed against Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, and warned against mask wearing.
Because the disinformation in the video was likely to exacerbate the Covid pandemic, platforms quickly took down the video, limiting its spread.
We are indeed seeing a shift by the platforms away from their previous free-speech absolutism toward an understanding of speech moderation as a matter of public health and safety: Sharing disinformation about Covid makes it harder to fight the disease; similarly, sharing disinformation about voting weakens our democracy.
Under the paradigm of the public good, it makes much more sense for platforms to control speech more assertively.
But the actions by Twitter and other companies are being falsely labeled censorship. Trump isn't being censored. None of these platforms has an obligation to provide a forum to anyone. They are all private companies with terms of service that users agree to.
Trump, like everyone else on Twitter, agreed to its rules, which include a ban on promoting violence. This has nothing to do with the First Amendment, which applies to government restrictions on speech, not contracts between private companies and their users.
Furthermore, Trump still has vast powers to communicate. He has a press secretary, and a press briefing room inside his house. He can talk to reporters -- not to mention the country and the world -- whenever he wants. This is not a man who has a problem being heard.
If Trump is determined to speak online, he has other options. Many of the millions of followers Trump has lost on Twitter would likely find him on a new platform. And the press will surely continue to cover his rantings in a new space, even though we all might benefit from paying less attention to Trump once he leaves the White House.
All that said, the power over speech enjoyed by platforms like Facebook, Twitter and YouTube is still deeply problematic. The answer is not as simple as more regulation that shifts control from platform owners to government. A much better shift would be toward a new vision of social media -- as public infrastructure that is built, owned and moderated by communities.
This is how speech works in the offline world. If I say something violent or offensive within a group of friends, that group has to figure out how to respond: Do they confront me to make clear that such speech is unacceptable? Do they kick me out of the group? Or do they decide it's better to ignore or tolerate what I say?
These are not easy discussions, whether in the workplace, church or community organizations, but they are the bedrock of small-"d" democratic life.
The platforms have done us a real disservice in taking these discussions out of our hands and handing them over to algorithms and low-paid moderators overseas. We need to debate what sort of public speech and behavior we will accept and who we want to be as a society. A society that abandons this responsibility abdicates a key aspect of full citizenship.
The events of January 6 were heartbreaking and shocking, if not surprising. It is understandable that many are celebrating Trump's social media muzzling. The satisfaction people are feeling comes from seeing that actions have consequences, something that is, unfortunately, easier to demonstrate within corporate America than within the halls of Washington.
If you are afraid that Trump's removal from Twitter and Facebook reflects the disproportionate power those platforms have over public life, you're not wrong. We have allowed our public sphere to be controlled by a small number of private companies, and we are now discovering how vulnerable online speech can be.
The silencing of Trump was justifiable under the circumstances -- an active insurrection spurred on by the President -- but it still should lead us to sustained conversation about how we want our digital public spaces to operate, and what the rules should be.