Facebook’s ‘independent oversight group’ is destined to fail

If there were any doubt about how big Facebook has gotten, consider that CEO Mark Zuckerberg compared the struggles Facebook faces in deciding what kind of content to allow on its platform to the struggles cities face in mitigating crime.
“No one expects crime to be eliminated completely, but you expect things will get better over time,” Zuckerberg said in a conference call with reporters today.
That’s not the only government analogy Zuckerberg made. Alongside the call, Zuckerberg published a 4,500-word note about the struggles the company has historically faced in policing content on its platform, as well as where it hopes to go from here.
In it, Zuckerberg announced that the company is finally taking steps to create an “independent oversight group” next year (a body he has previously compared to a supreme court) that users could petition to appeal decisions made by content moderators. The idea is that Facebook should take a broader array of perspectives into account when making content moderation decisions.
There’s no question that, when you’re dealing with billions of users publishing tens of millions of photos, videos, and text posts each day, having more people decide what should and shouldn’t be allowed will be helpful. And Zuckerberg stressed that “there are a lot of specifics about this that we still need to work out.”
But Facebook hasn’t yet clearly established what it thinks a quasi-independent body could do that it couldn’t do on its own, and this announcement feels like a way to give users someone besides Facebook itself to blame when things go wrong.
Let’s first take a look at how Zuckerberg describes the team currently responsible for developing Facebook’s policies on what is and isn’t allowed, from his Facebook note:

The team responsible for setting these policies is global — based in more than 10 offices across six countries to reflect the different cultural norms of our community. Many of them have devoted their careers to issues like child safety, hate speech, and terrorism, including as human rights lawyers or criminal prosecutors.
Our policy process involves regularly getting input from outside experts and organizations to ensure we understand the different perspectives that exist on free expression and safety, as well as the impacts of our policies on different communities globally. Every few weeks, the team runs a meeting to discuss potential changes to our policies based on new research or data. For each change the team gets outside input — and we’ve also invited academics and journalists to join this meeting to understand this process. Starting today, we will also publish minutes of these meetings to increase transparency and accountability.

When asked during the call what types of people Facebook would hope to put on this independent oversight board, Zuckerberg and Facebook’s global head of policy management, Monika Bickert, said they hoped to staff the group with academics and experts on issues like hate speech, free speech, and other relevant topics.
If the independent board will be made up of people whose expertise is comparable to that of the people Facebook is already talking to, what could they do on an independent board that they couldn’t do by talking to Facebook’s policy team?
In theory, this independent board could make decisions about what content to keep up or remove without regard to Facebook’s commercial interests. But even with the loose parameters that Zuckerberg and Facebook have already set for the board, it’s difficult to see how that can happen.
For example, Zuckerberg stressed that Facebook would still make the initial decision about whether to take down or keep up a piece of content or an account, and would then handle the first appeal. If a user was dissatisfied with that decision, they would then have the ability to appeal to this so-called supreme court. In other words, the board couldn’t make Facebook take down content that a user hadn’t already reported.
Facebook wouldn’t need to worry about appointing people who could harm its commercial interests, because the board’s parameters would ensure it couldn’t take many actions against those interests: it could only respond to individual user requests, not push for sweeping changes.
I’d be more receptive to the idea that an independent board might make for a fairer content moderation process if Facebook had been more upfront about its own shortcomings, but so far it hasn’t been.
Regarding Russia-linked trolls, Facebook’s repeated talking point has been that it was “too slow to act.” Zuckerberg infamously dismissed as a “crazy idea” the notion that fake accounts and disinformation may have affected the 2016 U.S. presidential election. And Facebook has struggled over the past year to explain the rationale behind some of its content moderation decisions in a trustworthy manner, initially saying that notorious troll Alex Jones hadn’t violated its policies, only to take down his account after companies like Apple and Spotify banned his podcasts.
When asked today by a reporter why he thinks he’s still the best person to fix Facebook, Zuckerberg responded, “I think we’re doing the right things to fix the issues…I don’t think me or anyone else could come in and snap their fingers and have these issues resolved in a quarter or half a year.”
Repeatedly, Facebook’s response in the face of criticism has been that it’s doing as well as anyone could expect a company to do when dealing with problems of foreign propaganda and hate speech exacerbated by technology.
Facebook has refused to concede that flaws in its business model or executive hires may have worsened these problems, or to entertain the notion that being broken up or regulated might help mitigate them. A willingness to accept brutal criticism is necessary if you’re going to honestly listen to an independent board.
Rather, it seems to me that the push to create an independent oversight board is meant to accomplish two things: first, to stave off government regulation for as long as possible; and second, to give trolls someone else to blame when they complain about Facebook taking down Alex Jones’ accounts or “shadow-banning” conservative news sites.
Facebook needs help from an independent body to course-correct, but until the so-called independent board it plans to appoint is given teeth and carte blanche, the one proposed today doesn’t appear up to the task. If Facebook isn’t careful, that course correction could come from governments instead, in the form of antitrust action or privacy regulation akin to GDPR.
Source: VentureBeat