What you need to know about the board deciding Trump's fate on Facebook

(CNN Business) The decision not to allow former President Donald Trump back on Facebook is the biggest and most contentious single content moderation determination the company has ever made. So who made that call? Well, surprisingly, not company founder and CEO Mark Zuckerberg.

The decision was made by the Facebook Oversight Board, an independent body often described as a kind of Supreme Court for Facebook. The board's decision, announced Wednesday, upheld Facebook's move to suspend Trump from its platforms, but said Facebook could not impose what it called the "arbitrary" penalty of an "indefinite" suspension. The company must review the case within six months and decide on a definite time period for the suspension (a permanent ban would be allowed, so long as it is not indefinite), the board ruled.

If the idea of a Supreme Court for a social network leaves you with a lot of questions, well, you're not the only one. Below, some frequent questions and answers about the board to help you get up to speed.

Remind me, what happened to Trump's Facebook account?

Trump's access to his Facebook and Facebook-owned Instagram accounts was cut off on January 7, a day after the deadly insurrection in Washington, DC. Zuckerberg wrote at the time, "We believe the risks of allowing the President to continue to use our service during this period are simply too great." Trump's accounts were suspended indefinitely.

Facebook later referred the case to its independent Oversight Board.

What did the board decide?

The board upheld Facebook's decision to suspend Trump's accounts, writing in its decision that two posts on Jan. 6 from the former President "severely violated Facebook's Community Standards and Instagram's Community Guidelines," which prohibit posts that praise people engaged in violence.

However, the board ruled that Facebook's imposition of an "indefinite" suspension was inappropriate, as "indefinite suspensions" are not described as a potential outcome in Facebook's content policies. The board gave Facebook six months to reevaluate the action taken on Trump's account and to apply some consequence consistent with its own rules.

Facebook Vice President of Global Affairs and Communications Nick Clegg said in a statement after the ruling that Facebook will "consider the board's decision and determine an action that is clear and proportionate."

So what exactly is the Facebook Oversight Board?

The board is an independent, court-like entity for appealing content decisions on Facebook-owned platforms. It's made up of 20 experts in areas like free expression, human rights, and journalism.

Content moderation decisions -- for instance, removing or not removing a particular post -- made by Facebook and Instagram can be appealed to the board once users have gone all the way through the company's internal review process. Facebook says that decisions made by the board are final.

Facebook first announced its intention to form an independent entity to vet content decisions in November 2018. After some delay, the company announced in October 2020 that the board would begin to hear cases.

Who is on the board?

Included among the 20 current members of the board are notable individuals from around the world, including Helle Thorning-Schmidt, former prime minister of Denmark; Alan Rusbridger, former editor-in-chief of The Guardian; and Tawakkol Karman, a Nobel Peace Prize laureate who promoted non-violent change in Yemen during the Arab Spring, a movement in which social media played an important role.

But the board just does whatever Facebook wants, right?

Nope. The board is designed to be independent of Facebook, according to its charter. Facebook funds a trust that, in turn, funds the board. The trustees are "responsible for safeguarding the independence" of the board.

Critics of the company argue the board is not truly independent and is a "Facebook-paid, Facebook-appointed body created by Facebook to use to launder its most politically sensitive decisions."

Suzanne Nossel, a Facebook Oversight Board member and CEO of the free expression organization PEN America, told CNN Business last week, "Obviously, Facebook has its own motives in this. Let's be clear. They're a profit-making enterprise. They wouldn't have done this if they didn't think it was good for business. They have taken some steps in putting money in a trust and creating an independent set of trustees that oversee the board itself. And so there are some efforts to make it genuinely independent."

"Whether those go far enough, whether circumstances arise that test or challenge those parameters, we'll have to see, but I think it's crucial, if the board is going to play any kind of useful role, that that independence be absolutely respected," she added.

Some -- perhaps many -- decisions the board makes may ultimately not be what Facebook would want, or might put the company in some uncomfortable positions. But however the board rules, Facebook does theoretically get the benefit of some cover on the most difficult content questions. The board's decision on Trump, however, didn't give the company as much cover as it might have wanted.

Does Facebook have to do what the board says?

A decision made by the board "will be binding and Facebook will implement it promptly, unless implementation of a resolution could violate the law," according to the board's charter.

What cases has the board taken on before this?

In its first set of rulings in January, the board overturned some decisions Facebook had made.

In one case, Facebook had removed a post from a user in Myanmar who had shared two photos of a Syrian toddler of Kurdish ethnicity who drowned attempting to reach Europe in 2015. The text accompanying the photo, according to the board's description, said there was "something wrong with Muslims (or Muslim men) psychologically or with their mindset." (Rohingya Muslims have been persecuted in Myanmar.)

Facebook removed the post due to its hate speech policies. The board overturned that decision.

In an explanation of the decision posted to its website, the board said, "[W]hile the post might be considered pejorative or offensive towards Muslims, it did not advocate hatred or intentionally incite any form of imminent harm. As such, the Board does not consider its removal to be necessary to protect the rights of others."

You can read the full decisions here.

-- Brian Fung and Kaya Yurieff contributed reporting.