Facebook’s new oversight board: Will it help?

Anyone who has spent much time reporting illegal or harassing posts on Facebook will attest that it’s a profoundly discouraging experience. In the four years since Sabine McNeill released videos of P and Q, many of our commenters have reported hundreds if not thousands of posts to Facebook, only to receive the infuriating response, “Thanks for trying, but this doesn’t violate our community standards”.

When we do manage to get posts removed, it’s cause for celebration. At a rough estimate, we’d say that about one of every 20 posts reported is actually removed. It’s a shameful reflection on Facebook, which claims to care about its users’ safety.

In November, Facebook announced a new plan to change the way it makes decisions about the content its users post. According to TechCrunch,

[A] new external oversight committee would be created in 2019 to handle some of Facebook’s content policy decisions. The body will take appeals and make final decisions. The hope is that beyond the influence of Facebook’s business imperatives or the public’s skepticism about the company’s internal choices, the oversight body can come to the proper conclusions about how to handle false information, calls to violence, hate speech, harassment and other problems that flow through Facebook’s user-generated content network.

Emphasis ours

The proposed structure of the reporting system will mark a shift from the “one-shot” approach currently in use:

Zuckerberg describes that when someone initially reports content, Facebook’s systems will do the first level of review. If a person wants an appeal, Facebook will also handle this second level of review and scale up its systems to handle a lot of cases. Then, he says, “The basic approach is going to be if you’re not happy after getting your appeal answered, you can try to appeal to this broader body. It’s probably not going to review every case like some of the higher courts . . . it might be able to choose which cases it thinks are incredibly important to look at. It will certainly need to be transparent about how it’s making those decisions.”

Yesterday, Facebook released a draft charter for its Oversight Board for Content Decisions.

According to this document, the proposed board will comprise up to 40 “global experts”, whose names will be public. They will have experience in “content, privacy, free expression, human rights, journalism, civil rights, safety, and other relevant disciplines”.

The first cohort of members will be chosen by Facebook, with consideration given to geographic and cultural balance, as well as to diversity of backgrounds and perspectives. Once the board has been launched, future board members will be chosen by current members.

Cases will be heard by panels drawn from a rotating pool of members, with an odd number on each panel to avoid tied decisions. Each panel will consider a “docket” of cases.

While all of this sounds encouraging, we’ve learned over the years not to get our hopes up.

For one thing, even given that the Oversight Board would only see cases which have been reviewed and rejected by Facebook’s standard review team, and then appealed by the complainant, Facebook has over 2 billion users worldwide. The idea that 40 board members could handle the volume of cases which could arise seems just a bit unrealistic.

We’re also a bit concerned by the skill set required of the board members. While we do believe in freedom of speech and expression, and think it’s important that these be considered, we don’t see any reference to expertise in cases of online harassment or abuse.

Nowhere do they mention illegal posts, or posts which violate court orders or reporting restrictions. Facebook’s policies to date have focused on the acceptability of posts within the company’s own terms of service, ignoring whether the posts violate the law in any given country. This is a major flaw, and needs addressing.

We will watch with interest as this new approach to reporting unfolds, and look forward to hearing about our readers’ experiences.

20 thoughts on “Facebook’s new oversight board: Will it help?”

  1. Excellent post, and may I say FB would do well to get some input from people who have actually been harassed already?

  2. I still have my doubts; I suspect this is another ‘facebook-saving’, cough, face-saving exercise on their part.
    It’s a standing joke that Facebook’s ‘standards’ are pretty bloomin’ low, considering what they allow to remain even after complaints are made…

  3. It will still take too long. It can take days to get a response to the first report, meaning that even under the new system the offending post will still be up for days, even a week, before it gets removed.

    • Yes, the scale of the thing is a bit mind-boggling. How they expect 40 people to deal with 2 billion, even if they only see the shit that rises to the top, is a bit mystifying to me.

      • Facebook must employ some very clever, computer-savvy people, so I can’t understand how they still have such a crap, not-fit-for-purpose reporting tool after all these years.

  4. The reporting process on FB has changed in the past few days, and I don’t particularly like it: three screens to report some things. It feels like they’re telling you not to trouble them by reporting anything. The team that thought this up needs to speak to real people who have been harassed, defamed, and have had their lives threatened on FB.

  5. As there’s an understandable lull, more threats from the Stephen Yaxley-Lennon crowd to report, this time directed at a colleague of Stephanie Finnegan, whom you may remember receiving threats for accurately reporting on the Melanie Shaw protests in Leeds last year. A little less patrolling and a few more arrests would be a more efficient use of police time.

    https://www.holdthefrontpage.co.uk/2019/news/journalist-told-she-would-pay-ultimate-price-after-covering-far-right-protest/

  6. His followers really do look at everything through tunnel vision; they have no respect for anyone else’s view. I do hope that email can be traced; those thugs need to realise that threats are taken seriously, online or offline.

    • His influence is now being felt at UKIP and in Brighton, as they try to compete with that political candidate on a harassment prevention order.

  7. My personal feeling is that someone should step up and take a leadership role in bringing these tech giants such as Facebook to task. That’s why I was excited to learn about an Irish company called depublish.com… actually, that name/domain is currently up for sale; what I meant to write is https://depublish.ie/. It should be an interesting innovation for all of us here. They are an example of a way forward. I think they are currently more interested in the harm done to under-18s, school-going children who are being bullied by their peers.

    Could any one of us here establish something similar?

    Facebook drives me mad with their lackadaisical approach – it’s clear that their imperative revolves around encouraging clickbait.

  8. I still say governments have to beef up or even implement cyberstalking laws. In this case it was not the ‘revenge porn’ that got the bloke jailed: it was the harassment.

    U.S. District Court Judge Marcia A. Crone sentenced Vincent G. Provines, 24, an ex-U.S. Marine reserve officer to almost three years in prison last week on felony cyberstalking charges for his offenses against Kate, a 23-year-old civilian from Dallas, Texas, and her parents. Newsweek is withholding their names to protect their privacy.
    https://www.newsweek.com/marine-officer-sentenced-revenge-porn-1310045

    • “Upon release, he will face two years of probation, wherein he is prevented from purchasing, possessing or having contact with any electronic device that connects to the Internet, according to court documents reviewed by Newsweek. The court has not yet ruled on whether to fine Provines up to $250,000 for his crime.”

  9. The oddest thing is, there was a court order that caused some providers to remove stuff; why don’t they still honour that?
