The debate over content moderation has become increasingly important for social media platforms and for jurisdictions across the globe. Determining which content must be taken down and which can be left up has become a point of contention between internet companies, governments and civil society groups. While social media platforms have often chosen to err on the side of caution, calls to protect free speech have prompted some of them to rethink which content they remove. This tension was highlighted by a lawsuit filed against Facebook in France over its takedown of ‘offensive’ artwork posted by a user on the platform. Since then, Facebook has begun rethinking the mechanisms through which it moderates content online, in an attempt to strike a balance between protecting free speech and keeping Facebook a safe space.
In November 2018, Facebook CEO Mark Zuckerberg published an opinion piece outlining updates to Facebook’s content moderation philosophy. A key proposal in this blueprint was an independent oversight body to hear appeals against Facebook’s content moderation decisions. Facebook developed this idea into the Oversight Board, popularly referred to as the ‘Facebook Supreme Court’: an independent body that will adjudicate appeals against Facebook’s content take-downs and inform and enhance Facebook’s moderation policy. Following a string of consultations, both internal and external, Facebook established the framework for this body in a charter published in September 2019.
Cases can be referred to the Board either by Facebook itself or by aggrieved users. According to Facebook, the Board will review matters that have “significance”, in that they affect the outcome of potentially dangerous situations or shape public discourse, and matters where Facebook is conflicted in its interpretation of the Facebook Community Standards. The Board’s decision on such appeals will be binding on Facebook; however, whether the precedent set by the Board is applied to take-downs of other posts of a similar nature is left to Facebook’s own discretion. Likewise, while the Board can give Facebook policy guidance based on the subjects it reviews in its cases, the decision to put that advice into practice rests solely with Facebook.
There are serious concerns about the scope of the powers the Board has been given. Even though the Board’s decisions on individual cases are binding on Facebook, the Board must adhere to Facebook’s Community Standards and other written policies in its rulings. This means the Board effectively cannot overturn controversial policy decisions taken by Facebook, such as recent ones on fact-checking political speech. The Board can merely interpret the Community Standards; it has no scope to rule on the basis of principles enshrined in international treaties such as the International Covenant on Civil and Political Rights.
This is the first independent oversight mechanism created by a platform of Facebook’s size, meaning there is no prior data from which to predict how successful the Board can be. Before setting any such expectations, a crucial question is what metrics can be used to judge the Board’s impact on the platform. The Board’s efficiency is a widely shared concern: its members serve part-time, and there are no defined rules on the length and frequency of hearings or on the number of cases the Board will review in a given period. According to Facebook, in the first quarter of 2019 alone it took action against over 70 million instances of objectionable content, a figure that does not include spam or fake accounts. Even if only a small percentage of these are submitted to the Board for review, the Board will have a high volume of cases to pick from. The share of such cases that are actually picked and reviewed will significantly determine the scale of the Board’s impact on how Facebook operates.
The second metric, which measures not just the Board’s impact but also Facebook’s willingness to allow outside regulation, is the extent of the changes to Facebook’s moderation policy brought about by the Board. Facebook has not bound itself to adopt the Board’s policy recommendations or to follow the precedents it sets, as it does not want to hand control of its fate entirely to an outside body. However, this does not mean Facebook is necessarily averse to letting the Board influence its policy. Understanding how deeply the Board’s decisions have changed Facebook’s policy, including its proactive decision-making, is a good way to gauge the Board’s success.
Thirdly, understanding how the Board’s decisions inform and shape public discourse on content moderation is essential to fully grasp the extent of the Board’s impact. Facebook has said that all of the Board’s decisions will be publicly available after redacting any data that might breach users’ privacy. This raises questions about how much can actually be learned from them, since the content in question is essential for contextualising both the debate and the Board’s decisions; nonetheless, the Board’s interpretation of Facebook’s charter will be apparent in its rulings. Facebook has recently begun disclosing more data on how it moderates content, which has prompted informed discourse on the appropriateness of its process, and the Board’s rulings are likely to further inform these discussions.
The structure Facebook has modeled is a big step forward from the mechanisms most social media platforms use for content moderation. It is plausible that the Board has been set up to divert the heat Facebook has received from governments, users and civil society groups. By allowing an independent body to influence moderation policy, Facebook may be trying to delay calls for legislative regulation of the platform. Facebook has drawn flak for its power to take down content proactively, and this may be an attempt to stave off cases like the one in France and a more recent one in Australia that seek to hold it responsible for its moderation actions. The creation of such a Board may produce a situation like that of the National Collegiate Athletic Association (NCAA), where athletes have been unable to win lawsuits against colleges because the rule-makers are independent of the rule-enforcers. This might also be Facebook’s attempt to fend off attacks on Section 230 of the Communications Decency Act (the rules on intermediary liability) and the flurry of anti-trust investigations that have been initiated against the firm.
It is also important to note that Facebook has explicitly prohibited the Board from taking up reviews that may expose Facebook to regulatory sanctions, or reviews of take-downs requested by governments citing local laws. Facebook and its employees have said in numerous interviews that the company prioritises the safety of its operations in the jurisdictions where it works. Hence, even though the Oversight Board may influence Facebook’s guidelines, in jurisdictions where free speech is frequently suppressed, the Board will have little ability to enforce free-speech principles in Facebook’s moderation actions.
On the whole, Facebook certainly has motives for allowing the Oversight Board to inform and influence its moderation decision-making in countries with relatively strong legal protections for free speech. If Facebook does allow the Board to significantly alter its policy, it might also change the debates around legislative regulation of content, which could greatly benefit Facebook and prompt other platforms to innovate with their own moderation mechanisms. However, in regions where Facebook prefers to err on the side of caution to ensure the safety of its operations, the Oversight Board is unlikely to make much difference to how content is moderated on the platform.
The views expressed above belong to the author(s).
Saipriya Salla, Program Associate, Aspen Network of Development Entrepreneurs