In 2018, Mark Zuckerberg announced the creation of the Facebook Oversight Board, a body often described as the platform’s own Supreme Court. The board commenced operations on 22 October 2020, and this blog post provides insights into its responsibilities and accomplishments during its first year of operation.
Facebook’s handling of content published on its platform has been discussed by academia and the general public for some time, especially against the background of the spread of so-called fake news and hate speech. After German lawmakers took action with the Network Enforcement Act (NetzDG), and while similar regulations are being discussed at the European level within the framework of the Digital Services Act (DSA), Facebook also took action. With the Facebook Oversight Board (FOB), the US company has created a body which, as an independent supervisory authority, is to review the current decision-making practice in dealing with published content and make suggestions for improvement. After its announcement in 2018, the development of the FOB was accompanied by experts from 88 countries. Around 1,200 public submissions with recommendations on the design of the FOB were considered before the final constitution of the body was published in September 2019. Although not yet fully constituted, the FOB officially began its work on 22 October 2020 and announced its first decisions on 28 January 2021.
Development and tasks of the Facebook Oversight Board
In its first year of operation, the FOB has grown from eleven to 19 members but remains well below the target size of 40 members set out in its constitution. Members are expected to have broad knowledge of moderating online content and be familiar with digital content. The goal formulated in the statutes of including members who demonstrate a “broad range of knowledge, skills, diversity and expertise” and reflect the “diversity of the Facebook community” has nevertheless been met: the members come from 16 different countries, with the United States being the only country represented by more than one member, namely four. The European Union member states are represented by three members, from Denmark, Hungary and France. Of the nine women and ten men on the board, 14 members have a legal background according to the FOB’s information, but journalists, media scholars, politicians and activists are also represented.
The FOB’s core task is to make content decisions as a second instance, either after being called upon by users in the context of a complaint or when Facebook itself submits cases to the panel. It reviews whether the decision made by Facebook staff during the moderation process aligns with Facebook’s content guidelines and values. In addition, the board may issue opinions that include recommendations regarding Facebook’s content policies. The FOB’s focus is on questions of freedom of expression. Other sensitive and much-discussed areas, such as Facebook’s News Feed ranking or political advertising, do not fall within its remit.
Looking at the statistics of the first year of work, 21 cases were accepted and 18 were decided; so far, none of them has involved Germany. In eleven of the decided cases, the FOB did not uphold Facebook’s original decision. Four cases were submitted to the FOB by Facebook itself, including the question of whether blocking the account (so-called de-platforming) of former US President Donald Trump was permissible under Facebook’s community standards. The remaining 17 cases resulted from complaints by users. The majority of the decisions concerned hate speech, but the FOB also dealt with cases of incitement to violence and the attempted sale of “regulated goods” such as drugs.
Standards of the Facebook Oversight Board
The FOB’s decision on the compatibility of Donald Trump’s de-platforming with Facebook’s rules has undoubtedly received the most comprehensive media attention. This is despite the fact that it was an atypical case that Facebook did not necessarily have to submit to the FOB. In addition to this prominent case, however, the FOB has decided several other noteworthy cases – for example, on 28 January 2021, a case that is also interesting from a German perspective. In the run-up to the US presidential election, a user shared a quote attributed to Nazi propaganda minister Joseph Goebbels. The quote included the thesis that it is more effective to appeal to the emotions and instincts of voters than to their intellect, and moreover that truth must be subordinated to tactics and psychology. While the user intended the quote as a comparison with Donald Trump’s presidency during the election campaign, Facebook deleted the post with reference to its guidelines on dangerous individuals and organisations. Facebook justified the decision by saying that the user had not made it sufficiently clear that he did not support Joseph Goebbels. The FOB did not share this view and ruled that the post had to be restored. It based this on the incompatibility of Facebook’s decision with international human rights standards. In addition, it criticised the vagueness and indeterminacy of Facebook’s rules and found that the decision was disproportionate due to its lack of context sensitivity.
The invocation of international human rights standards, as a supplement to or even in lieu of Facebook’s own policies and values, is a trend that characterises this decision and is also reflected in other FOB decisions. Incorporating these standards could be an essential step towards increasing the body’s emancipation from Facebook in the future and ensuring its independence. It could also help the FOB develop globally uniform standards for what may and may not be communicated on Facebook, should such a global solution be pursued.
Since Mark Zuckerberg announced his intention to set up the FOB, the project has been criticised. Above all, accusations have been made that it is merely a diversionary manoeuvre to avoid more far-reaching regulation, such as that which the DSA could impose. It is also argued that the FOB’s independence and influence on Facebook are too weak to lead to lasting changes, especially because important issues, such as the use of algorithms, are not covered by the FOB’s mandate. Moreover, it is questionable whether Facebook will implement all of the FOB’s decisions. There is a danger that Facebook will ignore disagreeable decisions or abolish the body altogether. In a way, Mark Zuckerberg remains the constitutional authority behind the FOB, so future developments will be worth watching.
Despite these points of criticism, it should not be forgotten that with the introduction of the FOB, users for the first time have an internal legal remedy with which they can challenge Facebook’s decisions without having to seek a dispute before the state courts. This is also reflected in the fact that the majority of cases decided by the FOB were such user complaints, not just cases submitted by Facebook in its own interest. The fact that the FOB “overturns” Facebook’s original decision in many cases is likewise a good sign for users. In this respect, the introduction of the FOB is, in any case, a step forward compared to the non-transparent status quo, which gives reason for confidence about the body’s further development. The fact that the FOB decided to meet with whistleblower Frances Haugen after the revelations of recent weeks is also a positive sign: the body wants to use her experience and insights to create stronger momentum regarding transparency and accountability. However, it is crucial to keep in mind that while the FOB as an internal compliance mechanism can improve the protection of users’ rights, it can only complement and in no way replace state legal protection.
The blog posts published by bidt reflect the views of the authors; they do not reflect the position of the Institute as a whole.