Open Knowledge Nepal
Sat Jan 24, 2026

Have you ever wondered why certain posts appear on your feed while others don’t? Why do you see some content repeatedly, while other voices seem to disappear? What decides which videos go viral and which ones never reach beyond a handful of viewers?
Every time you scroll through social media, invisible algorithms are making decisions for you, choosing what you see, what gets hidden, and whose voices get amplified. These automated systems shape our understanding of news, culture, and even reality itself. But do we really know how they work?
For content creators, these algorithms can be even more consequential, determining whether their posts reach audiences, whether their accounts get restricted, or whether their content gets removed entirely without a clear explanation. For all of us as consumers, these same systems curate our digital experiences, often in ways we don’t understand or can’t see.
These automated systems, powered by artificial intelligence (AI), make countless decisions every day about what content we see, share, and create on social media platforms. But how do these systems work? Who decides the rules? And more importantly, how do they affect voices from Nepal?
When Algorithms Decide: Filtered Voices from Nepal is a 6-month community-driven project that aims to demystify algorithmic content moderation and document its real impact on Nepali digital creators, activists, journalists, and everyday users.
Over the next few months (up to May 2026), we will:
This project centers the voices of:
Whether you’ve had content removed, experienced account restrictions, or simply want to understand how algorithms shape what you see online, this project is for you.
By the end of this project, we aim to:
This project is supported by the Pulitzer Center, whose commitment to independent journalism and underrepresented voices enables us to conduct rigorous research while amplifying stories from Nepal’s digital landscape. The Pulitzer Center’s support helps us connect local experiences with global conversations about AI accountability, platform transparency, and digital rights.

We’re starting by listening to you.
If you use social media or have ever experienced content removal, account restrictions, or algorithmic bias on social media platforms, we want to hear your story. Your experiences will help us understand the real-world impact of algorithmic moderation in Nepal and inform our advocacy for more transparent and fair systems.
Take the Survey: https://forms.gle/7f4qU3D2BisQfc446
The survey takes approximately 4–5 minutes to complete. All responses will be treated confidentially, and you can choose how much information you wish to share.
This is a community project, and we need your voice. Whether you’re a digital creator, journalist, activist, researcher, or simply someone who cares about digital rights in Nepal, there are many ways to participate:
Together, we can make algorithmic moderation more transparent, accountable, and fair for all voices in Nepal.