By Meera Selva, CEO of Internews Europe and Melody Patry, Senior Director of Technology, Media, and Democracy.
How do you create an internet that works for everyone and supports freedom of expression while protecting our social fabric and human rights? And who creates this internet, regulates it, and documents the asymmetric impact digital communications have on different groups?
UNESCO, which has a mission to defend freedom of expression and support independent media, is drawing up guidelines to create this kind of internet. We have been talking about this subject for years, and everyone, from the platform companies to the World Health Organisation to media companies, has tried to grapple with the many issues.
Internews Europe and the government of Estonia gathered a group of journalists, academics, scientists, activists, technology companies, and media development organisations who have been looking at this issue for years to ask them what they learned.
1. Any decision on media regulation needs to be multi-stakeholder, so it’s not decided by the same group of people talking to each other.
The internet is global. Digitalisation has created opportunities for marginalised voices to be heard, but it has also unleashed violence and harm that has had an asymmetric impact on different communities. The people most harmed by online attacks are often those with the least power in regulatory spaces. An international multi-stakeholder approach ensures all perspectives are heard and all needs are met.
2. Don’t over-regulate, as it kills creativity and facilitates censorship.
Regulation alone will not solve the problems of hate speech and misinformation, and regulatory mechanisms can reinforce existing privileges and power structures. In many countries there is a long history of media ownership concentration and political pressure to control public discourse. Social media has led, in many cases, to more media pluralism. To protect this, regulatory frameworks should clearly state adherence to international human rights law and principles.
3. Technology is constantly evolving, so all regulation should be prepared to evolve and be under constant review.
Politicians seeking to regulate digital spaces need to take time to understand the technology and also how different groups in society interact with it, both domestically and abroad, and be prepared to keep their knowledge updated.
4. Don’t confuse media outlets and internet platforms when speaking about regulation and disinformation.
It is still important to make the distinction between media and social media. There is an interdependency in which internet platforms shape media distribution and outreach, but laws to prevent the spread of disinformation on social media should not be used to attack independent journalism and legitimate criticism of governments.
5. Media viability needs to be at the core of the issue.
Technology companies have upended traditional media business models. Regulatory efforts to create healthy information ecosystems should also look at ways to support local and public interest media and recognize that a financially viable media sector is vital for creating a well-informed, empowering digital space.
6. Pay attention to who monitors the implementation of regulation, and make sure they are properly representative and resourced.
It’s important to make sure rules are followed properly. Yet details on monitoring the implementation and enforcement of regulations are often left out of regulatory frameworks, and the work becomes an ad hoc task that falls to civil society organisations and academics who are already stretched for time and resources and have no enforcement authority. Guidelines for regulation should include information about the technical features for implementation.
7. We need access to data to understand how information ecosystems work, and stakeholders need to come as partners, not beggars, to access that information.
The Covid-19 pandemic highlighted the importance of understanding where people turn for information. Much of this information is held by the large technology companies, which define the terms under which others can access, understand and analyse the data. There needs to be a willingness to share data to ensure public interest news and information can reach the right audiences effectively. This needs to be combined with strong data protection for users.
8. Enable and amplify community voices. Don’t let regulation serve only the interests of those who wield power.
Digital transformation has empowered community voices, but there are many precedents of states using content moderation as a tool for censorship. Regulations should explicitly include remedies, safeguards and periodic review, with mechanisms for local civil society organisations to call out draconian bans. There also need to be effective mechanisms for communities to report hate speech and misinformation in all languages.
9. Transparency is vital.
We cannot solve the problems of disinformation, hate speech and violence-inciting content online – among other problems – without transparency of both information and motivations. There is currently little to no transparency about how amplification and recommendation algorithms work, or about the methods of data collection, use and profiling. Digital platforms should report and disclose more about their inner workings when it comes to content rules, enforcement procedures, remedies and appeals. While some information should be available and accessible to the public and users, sensitive information can be made available to vetted researchers, CSOs, independent auditors, regulators and journalists, who in turn need to be transparent about what they are seeking to achieve from this data, and about where their mandate comes from.
10. There is a shared responsibility to deal with this issue and address things when they go wrong.
We need a systemic approach to information disorder, looking at the root causes of the problems and separating out the symptoms from the disease. There is a shared responsibility to call out when groups are demonised, and when those in authority spread misinformation and sow chaos.