Facebook just can’t seem to win when it comes to what content it leaves up or pulls down from the social network, constantly drawing scrutiny from lawmakers, journalists and advocacy groups. So the world’s largest social network has been working on a potential solution: a body similar to a supreme court that would oversee some of its toughest content moderation decisions.
On Tuesday, Facebook laid out how its content oversight board, whose decisions can’t be overturned by the company, will work in a final charter. The company still has other details to work out, but the release of the final charter signals that Facebook intends to move forward with its plans. Still, Facebook will have to prove to the public that the board is truly independent and that it can safeguard user privacy as the new body weighs in on some of its most controversial decisions.
In a statement, Facebook CEO Mark Zuckerberg said the board’s decisions were final.
“The board’s decision will be binding, even if I or anyone at Facebook disagrees with it,” Zuckerberg wrote. “The board will use our values to inform its decisions and explain its reasoning openly and in a way that protects people’s privacy.”
The board will start out with at least 11 members and likely grow to 40 members total, according to the charter. Facebook will select a group of co-chairs, who will then select candidates for the board. The members will serve part-time for a three-year term and be paid through a multimillion-dollar trust. The board is expected to hear dozens of cases in its first year, but it could expand its load to thousands of cases as it grows.
Facebook plans to select the board’s initial members by the end of this year. Next year, the board will start hearing cases that the company brings forth first. The social network’s users will be able to appeal to the board during the first six months of 2020. Facebook’s oversight board will prioritize cases that are significant, meaning they affect a large number of people, fuel public debate or threaten someone’s safety or equality, the company said. The board will also review content decisions in which people disagree about the outcome.
The board will be able to make policy recommendations to Facebook, which could affect all of the social network’s users. Facebook has 2.4 billion monthly active users worldwide. The company said it’ll follow the board’s decisions even if it disagrees with the result, unless doing so would violate the law.
“We expect that that will happen once this board is up and running and that will establish that it is independent,” said Brent Harris, Facebook’s director of governance and global affairs, during a conference call on Tuesday morning.
Harris said the company was looking at people from different backgrounds and with various political viewpoints. Candidates could include former lawyers, judges, journalists and publishers who have experience making decisions under a set of standards and working together in a group.
Facebook’s formation of a board comes at a time when the social network has been accused of suppressing conservative speech, which the company denies. At the same time, it’s been under pressure to do more to combat hate speech, misinformation and other offensive content. This year, Facebook faced criticism after it decided not to remove a doctored video of House Speaker Nancy Pelosi that made her seem drunk.
Facebook said it set up the trust so other companies could join in the future. Other social networks such as Twitter and Google-owned YouTube haven’t announced plans for a content oversight board.
Originally published Sept. 17, 1:00 p.m. PT
Update, 4:05 p.m.: Adds more information.