Countering Health Misinformation in the Digital Age
Introduction
During the COVID-19 pandemic, I first saw a message in a family WhatsApp group: “Drink turmeric water daily and you’ll be protected from COVID-19.” Within hours, it had reached neighbours, colleagues, and even my students. The message seemed harmless, yet some people believed it, delaying vaccination in favour of forwarded advice.
For me, this was more than just another “fake news” forward. It was unsettling because it came from people I trusted—an aunt who always cared for my health, a neighbour who once helped when I was unwell. That moment made me realise misinformation doesn’t come dressed as a villain; it often comes wearing the face of care and concern. And because it feels familiar, it spreads with little resistance.
That personal experience highlighted for me a critical issue in the digital age: misinformation silently erodes trust in evidence-based healthcare (EBHC).
Background
The internet and social media have transformed access to information. Tools like Johns Hopkins’ COVID-19 dashboard and India’s Aarogya Setu app became lifelines during the pandemic, helping millions track data and exposure in real time.
But alongside these powerful tools, rumours flourished. I remember students logging into my online classes, brimming with questions not about the dashboard I was teaching, but about rumours they had read the night before: “Ma’am, is it true the vaccine changes your DNA?” “Will people get infertility after the second dose?”
These weren’t silly questions. They came from genuine fear, reinforced by the speed at which misinformation moved. The balance was off: science was precise but slow; misinformation was emotional and fast. Research echoes this reality: a Nature Human Behaviour study found that one in five COVID-related YouTube videos contained misinformation, and MIT researchers showed that false news spreads six times faster than truth on Twitter.

The Problem
Facts alone rarely convince. I experienced this firsthand during a vaccination workshop in Rajasthan.
I stood in front of a group of health workers, projecting a dashboard full of coverage rates, safety reports, and post-vaccination monitoring. I expected nods of confidence. Instead, I saw raised eyebrows and quiet conversations. When I invited questions, one woman finally asked: “Madam, these numbers are fine. But in our village, three women had fever for two days after taking the vaccine. How do I explain that to people?”
Her honesty struck me. I realised that my carefully curated evidence didn’t address her lived reality. She needed a bridge between numbers and human stories. The problem wasn’t data—it was resonance.
The Search for Answers
Strategies to counter misinformation must integrate digital tools with human connection. Global campaigns teach us much, but through my own teaching and fieldwork I have also learned that the turning point comes when evidence feels personal.
• Ebola in West Africa (2014–16): Flyers didn’t change burial practices. But when local storytellers and theatre groups explained safe practices through familiar rituals, communities began to adapt.
• Polio in India: Families ignored medical flyers but listened to Bollywood actors and local religious leaders. The messenger mattered as much as the message.
• In my classroom: I once explained an electronic health record by telling the story of a diabetic woman who missed her insulin because her data wasn’t recorded properly. Suddenly, what seemed like a dry technical tool became a life-saving system in my students’ eyes.
The lesson I carry with me is simple: evidence becomes powerful when it is wrapped in human stories people recognise.
Result/Outcome
Misinformation is countered most effectively when evidence is translated into relatable stories and delivered by trusted voices.
• The WHO partnered with Facebook and Twitter to redirect users to verified information hubs.
• UNICEF used infographics and short videos to explain vaccines, cutting through jargon with colour and clarity.
• In Kerala, India, ASHA workers went door-to-door, answering questions with patience. I once shadowed an ASHA who spent 45 minutes with one hesitant family, listening more than talking. By the end, the family agreed to vaccination—not because of data sheets, but because someone they trusted sat with them.
In my teaching, I’ve started encouraging students to “translate” data for specific audiences—imagine explaining vaccine efficacy not to a policymaker, but to their grandmother. The results are always striking: they strip away jargon, they lean on empathy, and the evidence comes alive.
Challenges/Obstacles/Lessons Learned
Common pitfalls include:
• Overloading audiences with jargon, leaving gaps filled by misinformation.
• One-way communication that talks at people rather than engaging them.
• Blind reliance on algorithms; AI can detect fake news but often misses cultural and contextual nuances.
Lessons learned:
• From my Rajasthan workshop: dashboards alone don’t convince; stories do.
• From my teaching: evidence becomes memorable only when learners see themselves in it.
• From my personal WhatsApp experience: misinformation feels trustworthy because it comes through trusted networks—countering it means creating trust in evidence within those same networks.
• From global campaigns: human connections remain central in fostering trust.
Next Steps
To counter misinformation effectively:
1. Human + Digital Together: Use AI and apps to amplify accurate messages, but rely on trusted community voices—health workers, teachers, even family members—to carry them credibly.
2. Simplify Without Oversimplifying: Employ stories, analogies, and visuals to make evidence stick while retaining accuracy.
3. Collaboration in Action, Not Just Words: Researchers must provide clarity, policymakers must amplify reach, and communities must contextualise. In practice, this means dashboards designed by scientists explained by ASHAs in local dialects, and decisions shaped by community feedback loops.
Key Take-Home Messages
• Misinformation spreads faster than truth, especially on social media.
• Facts alone are insufficient; evidence must connect emotionally and culturally.
• Digital tools amplify information, but human trust remains essential.
• Storytelling and local voices are more effective than technical data alone.
• Collaboration only works when each actor brings their strength into play—researchers, policymakers, and communities working together at the same table.
References & Resources
1. World Health Organization. Fighting misinformation in the time of COVID-19. 2020.
2. Vosoughi S, Roy D, Aral S. The spread of true and false news online. Science. 2018;359(6380):1146–51.
3. Nagpal N, et al. COVID-19 misinformation and its impact on public health. Nature Human Behaviour. 2020.
4. UNICEF. Storytelling and visuals for vaccine communication. 2021.
5. Government of India, Ministry of Health and Family Welfare. Polio eradication campaign reports.
To link to this article - DOI: https://doi.org/10.70253/XQLM7939
Disclaimer
The views expressed in this World EBHC Day Blog, as well as any errors or omissions, are the sole responsibility of the author and do not represent the views of the World EBHC Day Steering Committee, Official Partners or Sponsors; nor does it imply endorsement by the aforementioned parties.