Over the last few years, India has witnessed a steady rise in the number of internet users coming online for the first time from semi-urban and rural areas. There’s also been a spike in the number of instances where false information circulated online has resulted in confusion, social disharmony and, at times, even loss of life. During the pandemic, these problems became more glaring than ever as misinformation about COVID-19 spread like wildfire.
Study conducted on how people respond to misinformation
Earlier this year, Internews launched FactShala in collaboration with DataLeads and with support from Google.org and the Google News Initiative. As we began this news and information literacy programme, designed to help India's new internet users navigate the maze of online misinformation, we wanted to understand why people fall for misinformation, what signals they rely on to evaluate online sources and what makes this population susceptible to false claims.
To answer these questions, team FactShala conducted a pilot study in consultation with the Stanford History Education Group (SHEG) and analysed a total of 1,955 responses from 391 respondents from over 90 cities and 25 villages across 23 states in India.
The respondents were given tasks that mirrored the most common forms of online messages and information. They were then asked to say in detail whether they would trust the information and to explain their rationale. The aim was to understand the kinds of challenges people face when they encounter information online. Most respondents answered in writing, though many answers were recorded as video or audio files.
Findings indicate a need for fact-checking strategies
An analysis of the responses revealed a stark contrast between the strategies used by professional fact-checkers and those used by respondents when assessing online content, and it pointed to clear patterns in why people fall for misinformation. One of the strongest patterns was that respondents across cities and villages focused overwhelmingly on content rather than on the source: the majority did not question whether the source was a credible authority on the subject of the information being shared.
Our analysis also showed that respondents relied primarily on personal biases and beliefs when deciding whether to believe or reject a message, and that they rarely verified information against other sources on the web. Even when respondents were noticeably skeptical of social media, they struggled to verify information because their skills for evaluating evidence were rudimentary.
“In the midst of a flood of disinformation about COVID, the FactShala user study provides a timely portrait of how people across India evaluate online information,” noted Joel Breakstone, director of the Stanford History Education Group. “This study’s insights will inform the critically important work that FactShala is undertaking to prepare individuals to sort fact from fiction online.”
Read the full FactShala report
A curriculum developed for teaching media literacy
Insights from the user study helped FactShala create a curriculum for a core group of 250 trainers: journalists, media educators, fact-checkers, non-profit workers and community radio stations from almost every state of India, selected from hundreds of applicants. These trainers are now helping adults over the age of 18 in Tier 2 and Tier 3 cities and villages in India understand their information ecosystems better, identify misinformation and locate trustworthy sources of information.
The curriculum modules, built in consultation with experts from Amity University, the BBC, BoomLive, Don Bosco University, IIJNM, Hong Kong University and the Stanford History Education Group, focus on information neighborhoods, source verification and critical thinking. The curriculum is designed to make the programme audience aware of the changing media landscape and the key differences between traditional and social media, to explain the difference between news and non-news content, and to train participants in basic verification techniques so they can evaluate the evidence shared in online posts. The modules also ask participants to reflect on their personal biases so that they can move from passive to mindful consumption of information.
The curriculum has been translated into seven Indian languages and is now being used by FactShala trainers, who are organising online and offline training sessions in cities and villages across India. In the languages of their choosing, tea garden workers, urban slum dwellers, child rights activists, LGBT communities, rural teachers and women's rights groups are learning how to understand their information ecosystems better and evaluate online information critically.
(Banner photo: Tea garden workers from West Bengal attend a FactShala session. Credit: Internews)