How to Create a Content Moderation Court
With more than 2 billion users, Facebook recognized that it shouldn't be making so many decisions about speech and online safety on its own. The Oversight Board was created to help Facebook answer some of the most difficult questions around content moderation. Given binding authority to review Facebook's content decisions, the Board needed to build an institution that supports people's right to free expression and ensures human rights are adequately respected. By March 2021 the Board will have heard several cases, and the panelists will discuss the challenges and opportunities of creating an institution to address some of the toughest issues in upholding human rights online. Join us to learn more about the Board's progress and what's next.
- What were some of the initial challenges when creating an institution to examine content moderation decisions?
- Some have referred to the Oversight Board as "Facebook's Supreme Court." Is this a fair characterization? How are you different?
- The Oversight Board has been hearing cases referred by Facebook and by users themselves for several months now. What are some of the key lessons learned?
Rachel Wolbers, Public Policy Manager, The Oversight Board