Author: Pierre François DOCQUIR (PhD),
Head of Media Freedom at ARTICLE 19
Email: pierre@article19.org
Twitter: @pfd_FreeMedia
Dominant social media companies hold a considerable degree of control over what their users see or hear on a daily basis.[1] Current practices of content moderation offer little transparency and virtually no remedy to individual users. The responsibilities of the largest social media companies are currently being debated in legislative, policy and academic circles across the globe, but many of the initiatives being put forward do not sufficiently account for the protection of freedom of expression. Against this quickly sketched background, the global free speech organization ARTICLE 19 has proposed the creation of the Social Media Council (“SMC”) – a model for a multi-stakeholder accountability mechanism that would provide an open, transparent, accountable and participatory forum to address content moderation issues on social media platforms.
A fundamental objective of the project is to bring the moderation of online speech back where it belongs: in an open, transparent forum. To fulfil that role, the SMC needs to be composed of representatives of the broad diversity of society, including vulnerable groups and minorities. The Social Media Council would be created by all relevant stakeholders, with the goal of applying human rights standards (possibly in the form of a Code of Human Rights Principles for Content Moderation) to the review of content moderation decisions made by social media platforms. The mechanism would not in itself be legally binding, but the participating social media companies would commit to executing the Council’s decisions in good faith.
As the UN Special Rapporteur on Freedom of Expression, David Kaye, recently wrote: “we need global standards for global platforms, not the discretionary terms of service or some imagined version of the First Amendment. Human rights law provides the right set of guarantees for free expression, privacy, nondiscrimination, and due process.” ARTICLE 19’s proposal likewise suggests that international standards provide the appropriate reference for the oversight of content moderation. As is generally the case with the application of international standards, a certain margin of appreciation would be part of the SMC mechanism: this would allow the application of international standards to be differentiated between companies and their respective products (e.g., Facebook is different from Twitter), including the liberty for companies to adopt stricter restrictions on freedom of expression to accommodate their own editorial choices (although market dominance would narrow the margin of appreciation in this respect).
The Social Media Council is not the only idea that seeks to address content moderation as a matter of urgent democratic importance (see for instance the moderation standards council or the proposal of Global Partners Digital). Not only is this question very much in the spirit of the times, it is also emerging at the exact point of convergence between the goals and interests of human rights groups and social media platforms: avoiding the pitfalls of harsh legislative approaches that often come with disproportionate sanctions, helping to restore users’ trust through transparency and accountability, providing an effective yet adaptable form of regulation that can easily accommodate the constant evolution of tech platforms, and ensuring that the moderation of speech rests on the universal grounds of international law.
In a world where private actors hold considerable power over public debates, and at a moment when the Media Pluralism Monitor is being upgraded to include new standards for measuring media pluralism online, the existence of an effective, accountable, open, participatory and independent mechanism to review content moderation on the basis of international standards should be one of the new indicators of the state of freedom of expression online.
We are currently drawing up a roadmap for the creation of Social Media Councils, which raises a fair number of questions, small and big (see for instance the comments from our friends at EFF). At the present stage of discussions, there are different visions of what the exact roles and functions of this new mechanism should be (see the report from our conference with the Global Digital Policy Incubator at Stanford University). Rather than being mutually exclusive, these various configurations can be designed as complementary initiatives towards the common goal of bringing content moderation practices into closer compliance with international human rights standards. ARTICLE 19 has launched a public consultation to present and discuss the different possible orientations of a Social Media Council, as well as more technical issues such as the rules of procedure or the funding mechanism. Now is the time to help us shape the future of social media regulation: take a look at the detailed presentation of the SMC and share your thoughts (the consultation runs until 30 September 2019): https://www.article19.org/resources/social-media-councils-consultation/
[1] Martin Moore and Damian Tambini (eds), Digital Dominance, Oxford University Press, 2018.