SplinterCon 2025, Paris: A Recap

Surveillance and censorship technologies are evolving, but so are we.
In December 2025, OpenArchive joined technologists, researchers, and policymakers at SplinterCon Paris to attempt to answer the question: how can digital sovereignty serve resilience without enforcing isolation?
SplinterCon takes its name from the term “splinternet”: the splintering of the internet into multiple, fragmented networks caused by state shutdowns and censorship technologies. With splinternets come resistance and circumvention: emerging technologies have helped people break out of isolated networks and enabled resilient communications for users inside those networks. SplinterCon brings together practitioners who question the evolution of telecommunication networks, study censorship and shutdown strategies, and build technical responses.
Despite what feels like an increasingly ‘splintered’ internet, this year's edition provided some optimism. For each talk about censorship and shutdowns, there were twice as many presentations of tools and technologies that counter these threats. Save (Secure, Archive, Verify, Encrypt), our mobile archiving app that helps people preserve their media for the long term, is one of those technologies. While we joined SplinterCon to present it and share our recent research and development work around opportunities for preserving mobile media on decentralized storage networks, this was also an opportunity to learn and reflect on emerging best practices as an organisation building censorship-resistant tools.
All in all, we drew three lessons from what we saw at SplinterCon 2025:
- Big tech companies can become a liability;
- Surveillance and censorship technologies are evolving;
- Adoption of new technologies remains challenging.
Big tech companies can become a liability for at-risk users
Many presentations and talks this year focused on the evolution of Russia's internet (RuNet) since the beginning of the full-scale invasion of Ukraine in 2022. The observations were dire: while Russia's internet was almost entirely open 10 years ago, and people could not conceive that some websites would one day be blocked, the situation has drastically changed in the past three years, to the point where major services such as WhatsApp and Facebook are now blocked (Meta was actually designated as an extremist organisation by Russian authorities in March 2022). When not blocked outright, targeted websites and services can be throttled, as is the case with YouTube and Twitter/X, making access difficult and frustrating for users.
This change in behaviour is a sobering reminder that major tech companies can quickly become targets for an authoritarian government. If you and your community rely heavily on centralized services provided by companies like Meta, Google, or Apple, you should consider how they can become a liability for at-risk users and rights defenders once those services become unwanted in your country.
Even if they are not a target of censorship, Big Tech platforms can present risks. When they don't outright collaborate (technically and financially) with a given government, they might operate under that government's strict conditions. For example, security for Apple users in China differs from the rest of the world: services accessed by Chinese citizens are hosted on servers within the country that the government can access. This creates risks for users in some countries that don't exist for users elsewhere. Sometimes, those Big Tech services remain the best alternative for people who have no other way to stay connected with their community. But the situation in Russia is a reminder that keeping all your eggs in one basket and trusting a Big Tech provider with your data, communications, and contacts can come back to bite you.
OpenArchive knows this well. That's why our Save app relies either on publicly accessible archives, like the Internet Archive, or on FLOSS private servers that can be easily deployed and are much harder for governments to target and block.
Surveillance and censorship technologies are evolving
The evolution of the RuNet did not happen overnight. It is the result of the gradual deployment of technology enabling fine-grained control over the network. In particular, we learned about TSPU, black boxes deployed at internet service providers (ISPs) that offer different types of blocking mechanisms. This technology allows for targeted shutdowns, cutting internet access only in certain parts of the country, enables advanced blocking, and opens the door to targeted surveillance. Particularly concerning was the report that all forms of tunnelling (a technique for concealing traffic, used notably by VPNs) could be blocked blindly, meaning that no matter what tunnelling technology you use (Tor, WireGuard, OpenVPN, or others), it could be blocked. This finding suggests that the government has developed and deployed a protocol-agnostic solution to detect and stop technologies that conceal traffic.
Other countries rely on more traditional methods of blocking. A session on Iran's internet controls exposed how DNS blocking remains the solution of choice for the country, a method also used by France to prevent access to certain sites. In both cases, ISPs are responsible for enforcing the blocking, and such enforcement can be inconsistent. In France, for example, small-scale ISPs were not required to enforce the blocking of specific sites and were not provided with a blocklist. In Iran, a large-scale study of access to content from different parts of the country showed that a non-negligible percentage of requests for blocked content still succeeded, without the need for a VPN.
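The measurement logic behind studies like the Iranian one can be sketched simply: resolve the same domain through the ISP's resolver and through an independent reference resolver, then compare the answers. The sketch below is not from any SplinterCon talk; it is a minimal illustration in which resolver answers are passed in as plain values (so no network access is needed), `None` stands for NXDOMAIN or a timeout, and the block-page address is a hypothetical placeholder.

```python
# Hypothetical placeholder for an address that ISP resolvers return
# instead of the real one when a domain is DNS-blocked.
BLOCKPAGE_IPS = {"10.10.34.34"}

def classify(isp_answer, reference_answer):
    """Classify how a domain resolves from inside a network.

    isp_answer / reference_answer: the IP string each resolver returned,
    or None if the resolver answered NXDOMAIN or timed out.
    """
    if isp_answer == reference_answer:
        return "consistent"         # both resolvers agree: likely unblocked
    if isp_answer is None and reference_answer is not None:
        return "blocked:nxdomain"   # ISP resolver denies the name exists
    if isp_answer in BLOCKPAGE_IPS:
        return "blocked:blockpage"  # redirected to a block-page address
    return "anomalous"              # answers differ; needs manual review
```

Aggregating these labels across many vantage points is what makes it possible to say, as the Iranian study did, that enforcement is uneven: "consistent" results for supposedly blocked domains indicate ISPs that are not applying the blocklist.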
In the domain of advanced censorship and surveillance tech, new research exposed how China exports its Great Firewall technology to authoritarian countries. The researchers combed through leaked documents from Geedge Networks and detailed the various technologies and techniques sold to countries such as Kazakhstan, Ethiopia, Pakistan, and Myanmar. In addition to revealing how these solutions are technically implemented, the research also gave a glimpse of China's strategy for exporting its vision of internet control.
Adopting new technologies is hard
Where SplinterCon shines is in the myriad of tools presented that demonstrate innovative, smart solutions for circumventing or evaluating censorship and monitoring. From plugins on top of Signal that create helpdesks or tiplines, to frameworks for mapping influence operations, to Matrix-based social media, to decentralised messaging apps, it was inspiring to see how much effort is put into enabling connectivity even in the most challenging conditions.
Yet, one thing remains a constant for all these tools: adoption and retention are hard. From spreading awareness about a technology, to being able to use it easily and effectively, there are many gaps into which users might fall, preventing people from enjoying the benefits of these technologies.
A presentation of findings from user-focused research in Iran was particularly interesting in that regard. It highlighted the barriers users face when trying to adopt a new piece of circumvention technology, such as a new social media platform, a VPN, or a browser. The key highlights were enlightening: 1) contact beats information: people prioritise connection with their loved ones over information about ongoing events; 2) crisis erases capacity; and 3) no one uses documentation. The presentation also highlighted the key role played by enablers and more tech-capable users who can support others in adopting a new technology.
Those findings were incredibly important to us. They echo our own research, which has shown that if we don't prioritize usability as much as, or more than, privacy, security, and circumvention, no one will use new tools. OpenArchive was founded and created by a human rights archivist from within the communities we serve, and our mission is to meet people where they are and co-build technology that can be easily adopted to support people who are preserving evidentiary media. That is why we work with others to design technology that is intuitive, secure, and follows the Human Rights Centered Design Methodology.
Fundamentally, this kind of event reminds us of the importance of collaboration. When working to create connectivity and networks in spaces with heavy censorship and surveillance, sharing what we know, what we do, and how we do it matters. This is a principle we have embraced for the past ten years, making not only our app, but also our research findings and resources, free and open source, so that others can learn from, evolve, and expand upon the growing corpus of human rights tech. Surveillance and censorship technologies might be evolving and becoming ever more effective, but so are we.