Anyone who has had much experience with reporting illegal or harassing posts on Facebook will attest that it’s a profoundly discouraging experience. In the four years since Sabine McNeill released videos of P and Q, many of our commenters have reported hundreds if not thousands of posts to Facebook, only to receive the infuriating response, “Thanks for trying, but this doesn’t violate our community standards”.
When we do manage to get posts removed, it’s cause for celebration. At a rough estimate, we’d say that about one of every 20 posts reported is actually removed. It’s a shameful reflection on Facebook, which claims to care about its users’ safety.
[A] new external oversight committee would be created in 2019 to handle some of Facebook’s content policy decisions. The body will take appeals and make final decisions. The hope is that beyond the influence of Facebook’s business imperatives or the public’s skepticism about the company’s internal choices, the oversight body can come to the proper conclusions about how to handle false information, calls to violence, hate speech, harassment and other problems that flow through Facebook’s user-generated content network. [Emphasis ours]
The proposed structure of the reporting system will mark a shift from the “one-shot” approach currently in use:
Zuckerberg explains that when someone initially reports content, Facebook’s systems will do the first level of review. If a person wants an appeal, Facebook will also handle this second level of review and scale up its systems to handle a large number of cases. Then, he says, “The basic approach is going to be if you’re not happy after getting your appeal answered, you can try to appeal to this broader body. It’s probably not going to review every case like some of the higher courts . . . it might be able to choose which cases it thinks are incredibly important to look at. It will certainly need to be transparent about how it’s making those decisions”.
Yesterday, Facebook released a draft charter for its Oversight Board for Content Decisions.
According to this document, the proposed board will comprise up to 40 “global experts”, whose names will be public. They will have experience in “content, privacy, free expression, human rights, journalism, civil rights, safety, and other relevant disciplines”.
The first cohort of members will be chosen by Facebook, with consideration given to geographic and cultural balance, as well as to diversity of backgrounds and perspectives. Once the board has been launched, future board members will be chosen by current members.
Cases will be heard by panels drawn from a rotating set of board members; each panel will have an odd number of members, to avoid tied decisions. Each panel will consider a “docket” of cases.
While all of this sounds encouraging, we’ve learned over the years not to get our hopes up.
For one thing, even though the Oversight Board would see only those cases which have been reviewed and rejected by Facebook’s standard review team, and then appealed by the complainant, Facebook has over 2 billion users worldwide. The idea that 40 board members could handle the volume of cases which could arise seems just a bit unrealistic.
We’re also a bit concerned by the skill set required of the board members. While we do believe in freedom of speech and expression, and think it’s important that these be considered, we don’t see any reference to expertise in cases of online harassment or abuse.
Nowhere do they mention illegal posts, or posts which violate court orders or reporting restrictions. Facebook’s policies to date have focused on the acceptability of posts within the company’s own terms of service, ignoring whether the posts violate the law in any given country. This is a major flaw, and needs addressing.
We will watch with interest as this new approach to reporting unfolds, and would welcome hearing about our readers’ experiences with it.