When Algorithms Decide: Filtered Voices from Nepal

Open Knowledge Nepal | Sat Jan 24 2026

Have you ever wondered why certain posts appear on your feed while others don’t? Why do you see some content repeatedly, while other voices seem to disappear? What decides which videos go viral and which ones never reach beyond a handful of viewers?

Every time you scroll through social media, invisible algorithms are making decisions for you, choosing what you see, what gets hidden, and whose voices get amplified. These automated systems shape our understanding of news, culture, and even reality itself. But do we really know how they work?

Understanding the Invisible Forces Shaping Our Digital Spaces

For content creators, these algorithms can be even more consequential, determining whether their posts reach audiences, whether their accounts get restricted, or whether their content gets removed entirely, with no clear explanation. For all of us as consumers, these same systems curate our digital experiences, often in ways we don’t understand or can’t see.

These automated systems, powered by artificial intelligence (AI), make countless decisions every day about what content we see, share, and create on social media platforms. But how do these systems work? Who decides the rules? And more importantly, how do they affect voices from Nepal?
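
To make that concrete, here is a deliberately simplified sketch, in Python, of the score-filter-sort pattern that feed-ranking systems broadly follow. Every name, weight, and threshold in it (rank_score, visible, the 3x weight on shares, the cutoff of 5.0) is invented for illustration; no platform publishes its real formula, which is exactly the transparency problem this project examines.

  import math

  # A hypothetical, deliberately simplified feed-ranking model.
  # Real platform systems are far more complex and are not public,
  # but the basic pattern -- score, filter, sort -- is similar.

  def rank_score(likes: int, shares: int, age_hours: float) -> float:
      """Score a post by engagement, decaying as it gets older."""
      engagement = likes + 3 * shares        # shares weighted more heavily (assumed)
      decay = math.exp(-age_hours / 24)      # older posts fade from the feed
      return engagement * decay

  def visible(posts: list[dict], threshold: float = 5.0) -> list[dict]:
      """Keep only posts above an opaque score threshold, best first."""
      scored = [(rank_score(p["likes"], p["shares"], p["age_hours"]), p)
                for p in posts]
      scored.sort(key=lambda pair: pair[0], reverse=True)
      return [p for score, p in scored if score >= threshold]

  posts = [
      {"id": "viral clip",   "likes": 900, "shares": 120, "age_hours": 2},
      {"id": "local report", "likes": 12,  "shares": 1,   "age_hours": 30},
  ]
  print([p["id"] for p in visible(posts)])  # ['viral clip']

Run this and the low-scoring local post simply never appears, with no notice to its author: a toy version of what creators describe as shadow banning, and a reminder that the thresholds and weights doing the silencing are invisible to everyone outside the platform.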

When Algorithms Decide: Filtered Voices from Nepal is a 6-month community-driven project that aims to demystify algorithmic content moderation and document its real impact on Nepali digital creators, activists, journalists, and everyday users.

What We’re Doing

Over the next few months (up to May 2026), we will:

  • Research & Education: Develop accessible Nepali-language explainers and visual materials that break down how algorithmic moderation works, making complex AI systems understandable for everyone.
  • Community Conversations: Host focus group discussions with diverse content creators, including youth, women, and civic activists, to document lived experiences of content takedowns, shadow banning, and algorithmic bias.
  • Multi-Stakeholder Dialogue: Organize a national roundtable that brings together civil society organizations, digital rights advocates, policymakers, and affected communities to connect policy to lived realities.
  • Creative Storytelling: Produce a bilingual (Nepali-English) zine that blends personal narratives with data and policy insights to make our findings accessible and engaging.
  • Public Engagement: Share insights through blogs, social media, and civic tech channels to foster broader conversations about AI accountability and digital rights in Nepal.

Who This Project Is For

This project centers the voices of:

  • Content creators who have experienced unexplained content removals or restrictions
  • Youth and women navigating algorithmic bias on digital platforms
  • Journalists and activists working on civic issues
  • Digital rights advocates seeking evidence-based policy recommendations
  • Everyday social media users curious about how platforms moderate content

Whether you’ve had content removed, experienced account restrictions, or simply want to understand how algorithms shape what you see online, this project is for you.

What We Hope to Achieve

By the end of this project, we aim to:

  • Create the first comprehensive documentation of algorithmic moderation’s impact on Nepali users
  • Build public understanding of how AI-powered content moderation systems work
  • Amplify marginalized voices often silenced by algorithmic bias
  • Identify gaps between platform policies and community experiences
  • Develop actionable policy recommendations for more transparent and accountable content moderation
  • Connect Nepal’s digital rights challenges to global conversations on AI ethics

Supported by the Pulitzer Center

This project is supported by the Pulitzer Center, whose commitment to independent journalism and underrepresented voices enables us to conduct rigorous research while amplifying stories from Nepal’s digital landscape. The Pulitzer Center’s support helps us connect local experiences with global conversations about AI accountability, platform transparency, and digital rights.

Share Your Story: Take Our Survey

We’re starting by listening to you.

If you use social media, or have ever experienced content removal, account restrictions, or algorithmic bias on these platforms, we want to hear your story. Your experiences will help us understand the real-world impact of algorithmic moderation in Nepal and inform our advocacy for more transparent and fair systems.

Take the Survey: https://forms.gle/7f4qU3D2BisQfc446

The survey takes approximately 4-5 minutes to complete. All responses will be treated confidentially, and you can choose how much information you wish to share.

Join the Conversation

This is a community project, and we need your voice. Whether you’re a digital creator, journalist, activist, researcher, or simply someone who cares about digital rights in Nepal, there are many ways to participate:

  • Share your experience through our survey
  • Follow our updates on social media
  • Participate in upcoming focus group discussions
  • Attend our national roundtable in March 2026
  • Share this project with others in your network

Together, we can make algorithmic moderation more transparent, accountable, and fair for all voices in Nepal.
