‘Online Harassment is Part of our Daily Lives’: How Gender-Based Harassment Drives Self-Censorship Online

Zoey Tung Barthelemy, Internews Private Sector Accountability Specialist, speaks about how gender-based harassment is driving self-censorship online.

Gender-based harassment increases the very real risk of violence against journalists and activists. It also has an immeasurable impact on our wider information environment, silencing vital voices and stunting civic discourse. Harassment comes in many forms, from sexualized comments and imagery to physical threats and ‘doxxing’, the publication of someone’s personal information.

In late 2021 and early 2022, Internews brought together journalists, human rights defenders, and civil society representatives in 14 countries to discuss the impacts of global technology platforms on diverse information ecosystems. Detailed feedback from these meetings has been compiled by Internews’ platform accountability team to share back with the relevant social media companies. The goal of this project is to track product changes and their impacts on each platform, and to share feedback with all other relevant tech platforms on a regular basis.

These discussions uncovered many issues, but a key concern raised by many participants was the prevalence of self-censorship on social media by women and minorities as a preventative and protective measure against online harassment and gender-based violence. Harassment, threats, and intimidation particularly target high-profile and outspoken women, often through what appear to be organized campaigns.

Key informants across diverse geographic contexts report that gender-based harassment on social media, including hate speech and impersonation, is normalizing the practice of self-censorship and forcing many to leave online platforms altogether. While many participants spoke about the need to carry on with their work despite the intimidation, many also recounted closing their social media accounts or using other tactics to limit their exposure.

“Parties were spreading hate speech through posts, especially against women candidates. They posted personalized and explicit, sexual content about the candidates, before and during the election.”

Civil Society and Media Convening Participant, Iraq

Social media and other tech platforms promised to be a great equalizer for women and minorities in communities around the world, offering them a voice and access to civic discourse. Unfortunately, harassment remains widespread on social media and heightens the risk of both online and offline harm for women and LGBTQI+ activists, journalists, and public figures.

Convening participants reported that even online threats made against male journalists and activists often target their female family members with threats of physical violence or rape. Whatever its form, this harassment appears designed explicitly to intimidate and silence both its direct targets and other platform users.

“I am getting rape and death threats almost every day.”

Civil Society and Media Convening Participant, India

The most urgent issue to surface from more than a dozen civil society convenings is how little documentation of online self-censorship exists across multiple countries. Tracking and understanding this behavior is inherently difficult because it relies on self-reporting by users. The onus falls on women journalists and LGBTQI+ activists themselves to ‘block’, ‘mute’, or ‘report’ online attackers; these remain their only defense mechanisms. Furthermore, this gap in evidence makes it difficult to hold platforms accountable for the lack of safety considerations built into their product designs.

“I had to start using a different last name on social media so I wouldn’t be found.”

Civil Society and Media Convening Participant, Brazil

Social media companies have a responsibility to ensure that their products do not sideline women or amplify offline inequities in online spaces. Platforms hold data that could shed light on self-censorship among women and minorities, yet that data is not shared externally and may not even be analyzed internally. For example, the number of users deleting their accounts, or the number of accounts going inactive, in the six to twelve months before and after an election would be a good starting point. Such data can help identify where additional protections are required for women to participate safely online.
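As a rough illustration of what such an analysis could look like, the sketch below counts account deletions in the six months before and after an election date. The data, field layout, and dates are entirely hypothetical; platforms do not currently share records of this kind.

    from datetime import date, timedelta

    # Hypothetical records of (account_id, deletion_date or None).
    # Real platform data of this kind is not publicly available.
    accounts = [
        ("a1", date(2022, 3, 10)),  # deleted before the election
        ("a2", date(2022, 7, 2)),   # deleted after the election
        ("a3", None),               # still active
        ("a4", date(2022, 5, 30)),
    ]

    ELECTION_DAY = date(2022, 5, 15)  # assumed election date
    WINDOW = timedelta(days=180)      # six-month window on each side

    def deletions_in_window(records, start, end):
        """Count accounts whose deletion date falls within [start, end)."""
        return sum(
            1 for _, deleted in records
            if deleted is not None and start <= deleted < end
        )

    before = deletions_in_window(accounts, ELECTION_DAY - WINDOW, ELECTION_DAY)
    after = deletions_in_window(accounts, ELECTION_DAY, ELECTION_DAY + WINDOW)
    print(f"Deletions in the six months before the election: {before}")
    print(f"Deletions in the six months after the election: {after}")

A meaningful version of this comparison would also need to be disaggregated by gender and account type, which is precisely the kind of analysis that only the platforms themselves are currently positioned to perform.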

“When we post anything now, we often think first how others would react to our posts. We work with the truth, but it is better to live and work, rather than meet a challenge and disappear.”

Civil Society and Media Convening Participant, Cambodia

To ensure social media platforms become responsible stakeholders in this fight, we need consistent insights, collected over time, into the degree of self-censorship on platforms such as Twitter, Facebook, Instagram, YouTube, and TikTok.

We need coordinated efforts to share best practices and insights publicly, and more research to understand how to protect women and reduce the need for self-censorship.

Women and gender-diverse stakeholders must be able to participate in the design of platform safety features and reporting mechanisms. Only when platforms can protect the most at-risk users on their products will they be able to protect our collective information environment.

The platform accountability team at Internews recently launched a public call for proposals to support organizations and/or independent researchers and journalists working on platform accountability challenges in underrepresented communities.


If you or your organization is also working on projects relevant to platform accountability and governance, our team would love to hear from you.