Real Facebook Oversight Requires More Than a 40-Expert Board

Reported by WIRED:

When Facebook announced in November that it would launch an independent oversight board, questions arose about what that might look like and how it would work. Facebook CEO Mark Zuckerberg, for one, compared the would-be governing body to the Supreme Court, in its potential capacity to review the biggest issues of the day and set a sort of Facebook case law. On Monday, Facebook released a draft charter answering questions about how such an institution might function. Much of it is still undecided, but one thing is clear: To compare Facebook’s board to the Supreme Court is to minimize the sheer complexity of what Facebook is setting out to accomplish.

With the US Supreme Court, the founding fathers created an über-powerful judicial body to enforce a constitution that now governs 325 million residents of the United States. That’s 325 million people whose beliefs and values may vary, but who are all subject to the same set of laws, who largely speak the same language, and who share at least some cultural norms or heritage. Up to 8,000 cases are filed with the Supreme Court each term. The court takes up only about 80 of them.

Contrast that with what Facebook is trying to do. In its charter, the company suggests creating a body of up to 40 “independent experts” to review Facebook’s most contentious content moderation decisions, casting the final vote on whether a given post or comment should stay or go and deciding how that ruling should shape Facebook’s policies in the future. This board would be choosing not from thousands of cases each term but from potentially several million every week. And its decisions would affect the world’s 2.3 billion Facebook users, a population roughly seven times the size of the United States.

Facebook’s desire to create an external review board makes sense. As the company’s unprecedented power has grown, trust in it has waned. In the proposal, Facebook writes that it set out to create this committee because “we have come to believe that Facebook should not make so many of those decisions on its own.” In other words, Facebook wants to distribute some of the unprecedented power it holds—and some of the blame it gets. But in practice, Facebook may end up merely shifting that power and concentrating it in the hands of another, less tested group, whose very structure and operating philosophy are being designed by none other than Facebook.

In a blog post introducing the proposal, Facebook’s vice president of global affairs, Nick Clegg, emphasized that the draft proposal is merely a starting point for a broader public discussion about how the board ought to operate. “We look forward to getting additional input in how best to build a board that creates accountability and oversight of our content policy and enforcement decisions and will continue to share milestones and progress,” Clegg writes.

Facebook’s own uncertainty about how this system will work is evident throughout the proposal, which lays out only its broad contours. For instance, the company notes that a 20-person board might foster camaraderie among its members, while a 100-person board would allow a wider range of viewpoints to be represented. Facebook therefore suggests a 40-person board, without explaining why that number is meaningful, except that it happens to fall between two numbers Facebook arbitrarily suggested.

Facebook will pick the initial cohort, saving it from the Kafkaesque process of drafting a separate committee to pick the final committee. After that, the members will pick their own successors, none of whom may be current or former Facebook employees or government officials. Board members will split into smaller panels to deliberate on individual cases and will choose their own dockets, based on referrals from Facebook users and from Facebook itself. Their collective decisions will be public, but their individual votes won’t be. They’ll be prohibited from accepting incentives pertinent to the cases before them, but they will be paid throughout their three-year, part-time terms.

Of course, there’s no enforcement mechanism for this type of system. There’s no nationwide database for Facebook board lobbyists—not that I’m suggesting one. Facebook is simply asking the public to trust that all of this will ensure its decisions are made in an unbiased, thoughtful manner. The problem is, trust in tech is a resource in short supply these days.

Over the next several months, Facebook plans to hold workshops in Singapore, Delhi, Nairobi, Berlin, New York, Mexico City, and other cities around the world, where it will solicit feedback on the proposal. For a company more accustomed to moving fast and breaking things, such considered public outreach is certainly a welcome change. But even this attempt at outside supervision is still largely under Facebook’s control. Meanwhile, Facebook has cracked down on outside efforts to monitor what’s happening on the platform: a ProPublica tool that collected information on political ad targeting recently broke after Facebook changed its API, in what the company says was a security measure.

With the board, Facebook is designing oversight in its own image. In doing so, it’s also opening itself up to a new wave of criticism about which 40 experts it considers to be worthy of the public trust. It’s already been repeatedly ridiculed for enlisting the conservative Heritage Foundation to assess allegations of partisan bias on the platform.

The very fact that the company is seeking a diversity of viewpoints all but ensures there will be disagreements about who qualifies—and that’s before any member has cast a single vote. Even the nine justices of the Supreme Court, who are confirmed only after a centuries-old bipartisan vetting process in the Senate, are largely defined today by their partisan allegiances, making some of their most consequential decisions appear to be little more than a numbers game. There’s no reason to think Facebook’s board wouldn’t be plagued by the same ideological infighting.

That’s not to say there isn’t value in having an extra set of eyes on decisions that Facebook’s far-flung moderators sometimes make in a matter of minutes if not seconds. But no team, no matter the size or scope, could ever adequately consider every viewpoint represented on Facebook. After all, arguably Facebook’s biggest problem when it comes to content moderation decisions is not how it’s making the decisions or who’s making them, but just how many decisions there are to make on a platform of its size.

In seeking to fix one unprecedented problem, Facebook has proposed an unprecedented, and perhaps impossible, solution. No, the decisions Facebook’s supreme court makes won’t dictate who’s allowed to get married or whether schools ought to be integrated. But they will shape the definition of acceptable discourse on the world’s largest social network. They’ll define what sort of speech constitutes hatred and violence and will have a say in whether or not it’s permitted to spread.

