DSA: Europe's Injustice System
EU's DSA promised digital rule of law. In Romania, it empowered an opaque web of government agencies, NGOs, and foreign actors to censor critical voices—without checks, transparency, or recourse.
Hot Takes:
• Censorship chapters of the DSA in action, not the ones protecting users’ rights
• DSA transposition into local law used to expand local censorship powers
• Romanian National Audiovisual Council meddling in Moldova’s elections
• Rapid Response System (RRS) quietly activated during election times
• RRS flaggers never announced (Romania, Czech Republic)
• RRS secrecy dictated by the European Commission
• Expert Forum: flagger with ties to executive powers
• Fact-checks that are themselves misinformation
• Freedom House influence combined with fact-checker status
• Missing Statements of Reasons in the DSA database
• Wiesel Institute stretching Holocaust mandate into politics
• No Romanian Out-Of-Court Settlement Body — no clear path to redress
• French company moderating TikTok content in Romanian
• Censorship targeting conservatives, Gaza coverage, and even satire
• Platforms ignoring formal requests for transparency
• Only one example of best practices on transparency
• ANCOM (Romania’s DSC) stalling and shirking enforcement
• Platforms (Facebook, TikTok, X) should make full disclosure of moderation requests
This report was authored by Stéphane Luçon and Patrick-André de Hillerin. No part of this publication may be reproduced, in English or in any other language, without the explicit consent of the authors.
Request of Information & Right of Reply: stephane.lucon()mondiplomatique(.)ro
Version 1.0.2 - Sept 30, 2025 - © Luçon & de Hillerin – All rights reserved
The Digital Services Act was supposed to bring the rule of law to the digital realm. Like any functioning governance system, it promised transparency, accountability, and checks and balances. Citizens would know why content was removed, have the right to appeal, and access redress mechanisms when wronged. The European Parliament fought for these protections, reinforcing the original European Commission proposal with more safeguards for fundamental rights.
At its core, the rule of law rests on the separation of powers: judges don’t prosecute, prosecutors don’t deliver verdicts, and appeals are handled by independent bodies. Transparency ensures that citizens understand the decisions that affect them. Due process guarantees that no punishment is handed down without a fair hearing. Even if it offers no real parallel to the justice system, the only institution that can settle cases before the law, the text of the DSA mimicked an intention of justice by guaranteeing user rights that were missing before.
Instead, what has emerged in Romania is a system where the same actors wear multiple hats, conflicts of interest abound, and transparency exists only on paper. Independent journalists are censored without warning, explanation, or appeal. Institutions entrusted with protecting users now sidestep Romania’s own constitutional guarantees: freedom of expression (Article 30) and the right to information (Article 31) are violated in plain sight.
This investigation shows how the DSA, far from securing rights, can be hijacked to serve power. Romania’s case is not an exception—it’s a warning. A canary in the coal mine for Europe and the rest of the world.
As you will see in the third chapter, too many instances of abusive censorship occurred for them to be mere accidents.
From the evidence, two conclusions naturally emerge:
1) there must be at least one bad actor targeting independent voices, whether on behalf of those in power or of particular interests;
2) the issue has reached systemic proportions.
This led us to wonder, for every single participant in the DSA architecture, whether they had a history of controversies or conflicts of interest, whether they were transparent about their funding and moderation activities, and whether they benefited from executive power or held partisan positions.
We also wondered if they had other incentives to misuse their role—for instance, content producers flagging competitors—how they qualified for these powers in the first place, and whether, when confronted, they explained their actions or chose to hide them.
Finally, we wanted to know what reforms could prevent such abuses and restore trust in the DSA.
The DSA's Architecture of Power
The Digital Services Act establishes a carefully layered ecosystem of actors, each with distinct roles and responsibilities—at least on paper.
Digital Services Coordinators (DSCs):
National regulators charged with supervising platforms and enforcing the DSA. DSCs are meant to act as neutral overseers and points of contact for systemic complaints.
Romanian DSC: ANCOM (the National Authority for Management and Regulation in Communications).
Very Large Online Platforms (VLOPs):
Tech giants providing the public square where users interact, create content, and consume it. The VLOPs are responsible for enforcing their terms of service, publishing moderation transparency reports, and responding to trusted flaggers and user complaints… Most have some form of partnership with fact-checkers, theoretically to address false information.
Most important VLOPs in Romania: Meta (Facebook), ByteDance (TikTok), Alphabet (YouTube), X (formerly Twitter; in Romania, X is mostly used for interactions with foreign public figures and audiences)
Trusted Flaggers:
Select organizations granted fast-track status when reporting illegal content. Their reports are supposed to be “treated without undue delay,” effectively giving them an escalated censorship pipeline.
Romania has 2 Trusted Flaggers: the Wiesel Institute for Holocaust Remembrance and Fighting Antisemitism, and Save the Children (Salvați Copiii)
Fact-Checkers:
Independent third parties tasked with evaluating the accuracy of content. Their assessments can lead to deamplification, labeling, or removal—even in the absence of legal violations.
According to the European Digital Media Observatory (EDMO), Romania has 4 fact-checkers: Funky Citizens (via its division “Factual”), AFP Verificat (part of AFP Fact-Check), Freedom House Romania, and Eurocomunicare.
We also unveiled the participation of the US company Lead Stories as a TikTok fact-checker.
Signatories to the Code of Practice on Disinformation:
Voluntary participants—including platforms, NGOs, and fact-checkers—who commit to certain transparency and anti-disinformation standards, including publishing annual reports on their actions.
Romanian signatories: Expert Forum, a direct signatory, and the EDMO members listed as fact-checkers above.
Rapid Response System (RRS) Participants:
Temporarily elevated actors granted Trusted Flagger-like powers during crises, including election periods. Their role is to act quickly—but this very speed can bypass proper analysis.
Romanian participants in the RRS: Expert Forum, Funky Citizens.
Disclaimer: as we publish, nobody in the moderation architecture could confirm the complete list of RRS participants.
Out-of-Court Settlement Bodies (OCSBs):
Supposedly independent arbitrators, these entities exist to mediate between users and platforms in cases of moderation disputes, offering a legal alternative to costly litigation.
A list of OCSBs is provided by the European Commission; only one mentions the ability to handle complaints in Romanian.
Other Institutional Actors:
Certain state bodies with a legacy in monitoring or censoring content can formally request content moderation during election cycles or when audiovisual standards are at stake.
Major Romanian institutions involved in moderation: Romania’s CNA (National Audiovisual Council) and BEC (Central Electoral Bureau). According to several people we interviewed, it is highly likely that the Ministry of Internal Affairs and the intelligence services can also directly request content removal.
Moderation teams:
Platforms carry out their own moderation, with teams reviewing content reported through the architecture described above or simply spotted by users or by algorithms. Interestingly, TikTok even outsources part of its moderation to a French-owned company in Brașov.
On paper, this looks like a comprehensive digital justice system. Each actor has a clearly defined lane. Moderation is supposed to be targeted, proportionate, and legally bounded. And yet—what Romania’s implementation reveals is something completely different: a system where these lines blur, roles overlap, and the checks vanish just when they matter most.
When Judges Become Prosecutors
The Problem of Role Redundancy & Conflicts of Interest
On paper, the DSA creates a balanced ecosystem: platforms, flaggers, fact-checkers, regulators, and settlement bodies, each in its lane. The promise is that no single actor would accumulate unchecked power.
In practice, Romania shows what happens when the same organizations wear too many hats at once. Fact-checkers also become flaggers. NGOs that produce content also judge competitors’ content. Advocacy groups demanding more powers also exercise those powers in secret. Instead of checks and balances, the system creates concentrated influence with no accountability.
At the center of this distortion are the Very Large Online Platforms (VLOPs). Under the DSA they are effectively deputized as delegated censors, held legally responsible for moderation choices they often make under pressure—or quiet instruction—from government-linked NGOs. If a VLOP ignores a flagged complaint, it risks massive fines. For the user, this means censorship decisions are almost automatic: posts vanish without serious explanation, appeals go nowhere, and the official transparency database hides more than it reveals. A journalist or citizen never learns who really triggered the takedown—whether a trusted flagger, a Rapid Response System participant, a state agency, or a subcontracted moderation team.
The safeguards that should protect users are aimed at the wrong target. Article 21 (out-of-court settlement) and Article 53 (systemic complaints) let you challenge platforms, but never the source of censorship. The actors who initiate takedowns—trusted flaggers, Rapid Response participants, politically connected NGOs—remain shielded from scrutiny.
This investigation has been ongoing since June, originally launched to prepare for meetings between the editorial staff of Public News and key Romanian actors—both the censored and the censors. Some of the early information requests were submitted by Cecilie Jílková, a Czech journalist specializing in digital rights. Our initial goal was to push ANCOM, Romania’s Digital Services Coordinator, to explain what had happened. When its lack of seriousness became blatant, we continued pressing, hoping ANCOM would acknowledge the necessity of acting ex officio—given the sheer number of journalists and independent voices censored in recent months. Instead, the exchange with ANCOM turned into a months-long correspondence, in which the very authority tasked with enforcing the DSA appeared hesitant to rise to its responsibilities.
After two months without meaningful progress, co-author Stéphane Luçon filed a formal Article 53 complaint pointing to censorship of journalists and independent voices on Facebook, TikTok, and X. Not because these platforms were necessarily making the censorship calls, but because they are the only actors the DSA allows users to confront. The complaint is designed to force an answer to the most basic due-process questions: Who made the call? When? On what grounds? At whose request? Why was a post removed—or an entire account deleted? It is also meant to get ANCOM to address the systemic risk represented by a moderation architecture targeting critical voices and serving the interests of the executive power, or of part of it.
Indeed, many of the takedowns appear to stem from politically motivated flagging campaigns that could be coordinated with Romania’s executive branch. The complaint is now expected to escalate to Ireland’s DSC, which supervises most major platforms in Europe—that is, if ANCOM properly forwards it, as the DSA requires.
This is the lens through which the Romanian case must be read. The following chapter reviews each participant we could identify in Romania’s moderation architecture—trying to assess for each of them a series of criteria that we used to spot good practices and bad practices.
Criteria used to analyse the participants in the moderation architecture:
Secrecy: from 0 (100% of the funding is identified with a clear breakdown) to 1 (no budget provided)
Unaccountability: from 0 (very reactive in engaging us) to 1 (not answering/addressing raised issues)
Role Accumulation: from 1 (a single role) to n (5 is the max, reached by Expert Forum; next is Funky Citizens with 4 hats)
Dependence: from 0 (financed by members or users) to 1 (total dependency on power structures)
Partisanship: from 0 (totally non-partisan with safeguards) to 1 (partisan/propagandistic)
Power: from 0 (hardly impactful) to 1 (shapes the flow of information)
The compounded score (a simple sum) then ranges from 1 (single role, no issues) to potentially 10 or more, depending on the number of “hats” a player accumulates.
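To make the rubric concrete, here is a minimal sketch, in Python, of how the criteria compound into a single score. The class and field names are our own illustrative choices, and the grades in the example are invented rather than the scores we assign to any real participant later in this report.

```python
# Minimal sketch of the rubric above: five criteria graded 0..1, plus a role
# count of 1..n, summed into a compounded "DSA Risk Score".

from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    roles: int               # role accumulation: 1..n "hats"
    secrecy: float           # 0 = full funding breakdown, 1 = no budget provided
    unaccountability: float  # 0 = very reactive, 1 = not answering raised issues
    dependence: float        # 0 = member/user financed, 1 = fully power-dependent
    partisanship: float      # 0 = non-partisan with safeguards, 1 = propagandistic
    power: float             # 0 = hardly impactful, 1 = shapes the flow of information

    def risk_score(self) -> float:
        # Simple addition: a single-role actor with no issues scores 1;
        # role accumulation is what pushes totals toward 10 or more.
        return (self.roles + self.secrecy + self.unaccountability
                + self.dependence + self.partisanship + self.power)

# Invented grades for a hypothetical four-hat NGO, not any real participant.
example = Participant("Hypothetical NGO", roles=4, secrecy=0.8,
                      unaccountability=1.0, dependence=0.9,
                      partisanship=0.7, power=0.8)
print(f"{example.name}: DSA Risk Score = {example.risk_score():.2f}")  # 8.20
```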
Romania’s DSA Implementation: A Book of Worst Practices

ANCOM
Romania’s Digital Services Coordinator
ANCOM holds all the powers the DSA assigns to such bodies. It supervises compliance with DSA obligations; decides which organizations receive Trusted Flagger status; processes Article 53 complaints; and coordinates with other national authorities on content moderation (details still pending via FOI request).
All of this is standard under the DSA. But the strategic significance of controlling such an institution should raise serious eyebrows—especially in a country like Romania.
Because ANCOM isn’t just any regulator. It has a long and notorious history of poor governance, politicization, and entanglement with the intelligence services. For years, it has been a parking lot for sinecures, nepotistic appointments, and operatives from the so-called “parallel state.” It is also infamous for burning through the vast revenues from airwave licensing on some of the highest salaries in the Romanian public sector.
In short, on paper, ANCOM fits the profile of the least suitable institution to be entrusted with the powerful role of Digital Services Coordinator. Nevertheless, “on the bright side,” ANCOM does not itself perform any form of moderation. It can only ask questions, and its perception is that the platforms aren’t even responding anymore.
Whereas ANCOM generally failed to rise to the stakes of the risks we exposed in our written requests for information, former Member of Parliament and current Vice-President of ANCOM, Mr Pavel Popescu, showed a clear willingness to engage, to point out misunderstandings in the public sphere, and to describe the actual challenges ANCOM faces.
On one hand, many believe that ANCOM has censorship powers, which it does not have under the DSA architecture. On the other, as an enforcer, ANCOM must push for transparency from the platforms and question them when users submit complaints. Per his testimony, the platforms are less and less willing to provide answers, and important requests, such as learning who, by name, participates in the VLOPs’ moderation teams, have not been granted by the VLOPs.
According to Mr Popescu, the “VLOPs are complaining against the DSA, but they aren’t even enforcing their own user agreements, which should be sufficient to grant transparency, appeal and redress for the users, no matter what the DSA requests are. They should be the ones to provide answers, they are literally the Gods of this arena. We would like all platforms to commit to full transparency by making public every debate and decision process over a specific post. It should always be clear who requested the removal: was it the platform itself, the public, a flagger, or an authority.”
CNA
Flagging and takedown orders
The National Audiovisual Council (CNA) was originally Romania’s broadcast media regulator. Under the national transposition of the Digital Services Act (DSA) into law, its powers have quietly expanded into the digital domain. Besides its historical focus on regulated media (TV, radio), it now regulates all video content with sanctioning authority; flags online material directly to platforms; and submits formal takedown requests to Very Large Online Platforms (VLOPs).
More disturbingly, the CNA has begun extending its influence beyond Romania’s borders. On September 2, 2025, a member of the CNA council tried to force the CNA to “adopt” a report by Expert Forum—an NGO described later in this report, closely aligned with executive powers, both Western and Moldovan—alleging “coordinated inauthentic behavior” during Moldova’s electoral campaign. But Moldova is not in the EU, the DSA does not apply there, and the CNA has no jurisdiction outside Romania. The content was flagged, though without a takedown request. The incident set a troubling precedent: a Romanian regulator recycling NGO analysis to justify cross-border censorship. Two weeks later, on September 16, the same thing happened again.
In terms of independence, CNA’s 11 members are appointed entirely through political channels, with no requirement for media or legal expertise. The acting president is a cellist and former political adviser. This is a body now issuing moderation orders to global tech platforms. Nevertheless, the collegial structure of the CNA makes it the least partisan participant in the current DSA landscape in Romania.
Over a 12-month period, the CNA issued over 374 decisions, covering a far greater number of posts. All of this directly contradicts Article 30 of the Romanian Constitution, which plainly states: censorship of any kind is prohibited.
As emphasized by Georgică Severin, member of the CNA council who opposed the introduction of the Expert Forum reports, the fundamental issue with the CNA’s expanded attributions during the transposition of the DSA into Romanian law was the breach of its original boundaries of competence. The CNA shifted from focusing strictly on “professional content” to addressing virtually anyone posting anything on the internet and finding an audience. The line can no longer be drawn between those who make a living from content production and ordinary content consumers. What was once an audiovisual watchdog overseeing professional TV and radio content is drifting into Orwellian activities, targeting any individual who shares an opinion online.
On the other hand, CNA’s processes are among the best in terms of transparency. Each session is a public debate among board members, and each decision is recorded and published. There is room for improvement, technically – we had to develop a software tool to scrape their decisions and an AI pipeline to summarize them – but it remains the best and only system for tracking censorship decisions so far.
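For illustration, here is a minimal sketch of the scrape-and-summarize approach we mean. The listing URL, the link filter, and the summarization stub are placeholders rather than the actual tool we built, which used an LLM for the final step.

```python
# Illustrative sketch of a scrape-and-summarize pipeline for CNA decisions.
# The listing URL and selectors are placeholders; the real site differs.

import requests
from bs4 import BeautifulSoup

LISTING_URL = "https://www.cna.ro/decizii"  # placeholder, not the real path

def fetch_decision_links(listing_url: str) -> list[str]:
    """Collect links to individual decision pages from a listing page."""
    html = requests.get(listing_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    return [a["href"] for a in soup.select("a[href]")
            if "deciz" in a["href"].lower()]

def extract_text(url: str) -> str:
    """Pull the visible text of one decision page."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    return soup.get_text(" ", strip=True)

def summarize(text: str) -> str:
    """Stub for the AI step: in practice a model condenses each decision
    into one tracking line (date, targeted content, outcome)."""
    return text[:300] + "…"

if __name__ == "__main__":
    for link in fetch_decision_links(LISTING_URL)[:5]:
        print(summarize(extract_text(link)))
```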
Expert Forum
Partisan Think-Tank, Advocacy, Rapid Response System Flagger, Executive Power hires… and Journalists?
Expert Forum presents itself as a think tank dedicated to public policy and governance reform. Under the pre-existing Code of Practice on Disinformation—now folded into the DSA—they gain temporary priority flagging powers during election times under the Rapid Response System mechanism. Yet although they’ve produced influential reports that shaped the narrative around Romania’s 2024 election annulment, they remain opaque about their own activities.
When asked to provide transparency about their censorship and moderation actions, Expert Forum does not respond. They did not even publish the reports required under the Code of Practice on “disinfocode.eu,” the official website of the signatories, until Sept. 2025 (a report has been announced but was not available at publishing time). They offer no clarity on how much content they flag, nor on how their practices compare to those of public institutions like the CNA.
Their public statements and participation in open letters raise further concerns. On December 4th, 2024, they were among the NGOs calling for the declassification of Supreme Council of National Defense (CSAT) information. While “transparency” sounds good in principle, in U.S. practice such declassification campaigns have sometimes enabled executive overreach without due process. That is exactly what happened in Romania: the declassification leaked private information and triggered an extraordinary—and unconstitutional—intervention by the Constitutional Court, which acted ex officio (outside its mandate) and annulled an election it had already validated, even though the published reports did not demonstrate fraud, let alone an actual massive Russian influence able to “convert” millions.
Two months later, Expert Forum signed a second open letter demanding “radical transparency.” This might sound like a good idea—until you read its content: it calls for enhanced access for experts and does not mention transparency for users even once. The pattern is clear. They advocate selective transparency, one that reinforces the authority of expert networks, while refusing to meet even the minimum standards required by the Code of Practice they themselves signed.
Yet when asked why Expert Forum was included in the Rapid Response System, ANCOM simply replied that they are a signatory—as though that alone justified a censorship role.
Transparency is for thee, not for me
In 2025, Expert Forum authored a report alleging “coordinated inauthentic behavior” during Moldova’s election campaign. But Moldova is not a member of the EU, and the DSA has no jurisdiction there. Nonetheless, the Romanian CNA discussed adopting the report in full, effectively using it to justify moderation action and narrative control in a foreign election. This incident marked a dangerous precedent: a Romanian regulator leveraging the analysis of a politically embedded NGO to interfere in another country’s internal affairs—under the pretext of moderation.
The cross-border entanglement doesn’t stop there. In July 2025, Expert Forum’s executive director was appointed to Moldova’s Prosecutorial Vetting Commission, embodying the organization’s close relationship with another country’s judicial architecture. This only deepens the blurring of lines between NGO activity, foreign policy, and ideological enforcement.
Finally, Expert Forum’s funding remains completely opaque. They list donors like the National Endowment for Democracy (NED), the U.S. Embassy, Open Society Foundations, various EU executive agencies, and even oil industry players—in what can certainly look like a who’s who of Deep State partisan meddling. Yet they offer no breakdown, making it impossible to assess their financial dependencies or potential conflicts of interest.
In short, Expert Forum is not just another NGO. It is a central node in Romania’s censorship architecture—an actor that speaks in the name of civil society while executing and legitimizing state-aligned moderation. Unless future investigations prove otherwise, they stand as a key architect of the region’s emerging system of controlled speech.
According to Freedom House Romania’s director, Cristina Guseth: “Expert Forum is the most powerful organization at the moment.” Other participants also noted that Mr. Ionita, Head of Expert Forum, occasionally acts as a journalist—further blurring the lines between advocacy, censorship, and media.
Funky Citizens
Fact Checker, Rapid Response System Flagger, Content producer, Advocacy group
Funky Citizens style themselves as a dynamic NGO “halfway between good governance and active citizenship,” mobilizing so-called “civically fit” citizens through data-driven projects in fact-checking, media literacy, youth engagement, and public-sector accountability. They run platforms like Factual.ro (fact-checking site) and Buletin de București (news content), and proudly display their compliance badge from the International Fact‑Checking Network’s Code of Principles—branding themselves as credible, nonpartisan actors.
Behind the sleek branding, however, lie nearly as many red flags as with Expert Forum.
Their most controversial role is as a fact-checking partner of Meta, where their assessments can trigger de-ranking, visibility reduction, or outright removal of posts across Facebook and Instagram. Yet their moderation track record includes selective enforcement (they don’t fact-check lies about sovereigntist candidates) and abusive rulings—some shaped by overt ideological framing, others built on laughably thin foundations. (Read: “Zuckerberg’s Funky Romanian Consent Factory.”) In one notable case, after being challenged over a misleading “false information” label, Funky Citizens refused to revise the ruling—merely correcting a convenient typo that had made their original fact-check even more misleading (see in the next section: “Why the DSA Fails at Transparency, Appeal, and Redress”).
The most concerning aspect of the fact-checking business revealed by our inquiries with Funky Citizens (and other fact-checkers under the Meta contract, like Demagog, active in the Czech Republic, but also AFP, active in Romania, the Czech Republic, and a long series of other countries) is that this business rewards quantity over quality. That is how Funky Citizens came to fact-check an ironic, hyperbolic claim made by an influencer on Facebook, saying that the recent, harsher speech law would put a prosecution target on all voters of right-wing populists.
Even more troubling is their participation in the Rapid Response System, where they act as a content flagger during sensitive election periods. ANCOM’s official justification for bringing Expert Forum into the RRS does not apply to Funky Citizens: they do not appear on the list of signatories of the Code of Conduct on Disinformation, but they are members of EDMO (the European Digital Media Observatory), which is a signatory, thereby granting its members access to the Rapid Response System…
A content producer, fact-checker, policy lobbyist, and platform flagger, Funky Citizens wears more hats than a proper logic of checks and balances would allow. And its CEO has now been announced as president of the Economic and Monetary Union and Economic and Social Cohesion section of the European Economic and Social Committee.
For the record, we received an answer from Funky stating that they would be willing to waive the Non-Disclosure Agreement with Meta, if Meta agrees. We salute this statement, which would be a major step toward transparency and toward understanding how things went so wrong that jokes were fact-checked and demoted under this contract. And we salute Funky Citizens’ constant willingness to engage with us.
Lead Stories
Algorithmic detection and fact-check
In the fact-checking ecosystem of Eastern Europe, the American company Lead Stories plays a prominent role. Partnering with TikTok, it relies on its proprietary tool Trendolizer to detect viral content. The company then issues fact-checks that can result in posts being de-ranked or removed. But because its process begins with trend detection, we believe there is a risk that Lead Stories functions more like a real-time content flagger than a traditional fact-checker—quietly shaping the visibility of narratives as soon as they gain traction.*
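To illustrate why trend detection and flagging can blur, here is a generic engagement-velocity sketch of the kind any viral-content tracker could rely on. It is purely illustrative: Trendolizer is proprietary, and nothing below reflects Lead Stories’ actual code, signals, or thresholds.

```python
# Generic engagement-velocity detection, the basic technique behind viral
# content trackers. Illustrative only; not Lead Stories' implementation.

from dataclasses import dataclass

@dataclass
class Snapshot:
    timestamp: float   # seconds since epoch
    engagements: int   # likes + shares + comments observed at that moment

def velocity(history: list[Snapshot]) -> float:
    """Engagements gained per hour between first and last snapshot."""
    if len(history) < 2:
        return 0.0
    dt_hours = (history[-1].timestamp - history[0].timestamp) / 3600
    return (history[-1].engagements - history[0].engagements) / max(dt_hours, 1e-6)

def surface_trending(posts: dict[str, list[Snapshot]], threshold: float) -> list[str]:
    """Return IDs of posts whose velocity exceeds the threshold -- the point
    at which a trend tracker would surface them for fact-checker review."""
    return [pid for pid, hist in posts.items() if velocity(hist) > threshold]

# A post gaining 5,000 engagements in two hours is surfaced; a slow one is not.
posts = {"post_a": [Snapshot(0, 100), Snapshot(7200, 5100)],
         "post_b": [Snapshot(0, 50), Snapshot(7200, 90)]}
print(surface_trending(posts, threshold=1000.0))  # -> ['post_a']
```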
Lead Stories insists that its clients do not influence what it fact-checks, and it highlights its membership in both the International Fact-Checking Network (IFCN) and the European Fact-Checking Standards Network (EFCSN). However, as the cases of Funky Citizens and Expert Forum demonstrate, these badges offer little assurance of rigorous oversight or independence.
Trendolizer, on its own, may be a legitimate product that can be licensed to any actor in the market. According to Lead Stories, Meta and ByteDance have never used or licensed it, nor has TikTok’s outsourced moderation operator, Majorel/Teleperformance, and no client accounts for more than 0.15% of revenue. Lead Stories further stresses that it does not have the power to issue takedown requests and is “a journalistic outlet collecting and publishing information.”
What remains crucial, however, is that Lead Stories appears to be the only fact-checker working with TikTok in Romania—a platform that has become increasingly strategic in the information war, particularly since the blitzkrieg-style rise of Călin Georgescu a year ago and the now-common belief that “TikTok makes presidents.”
*This is version 1.0.2 of the report. We are in ongoing discussions with Lead Stories to clarify the information we need and will continue to review this part of the report. A previous version stated that Meta and TikTok were using Trendolizer, which Lead Stories denied, as their contracts cover only the fact-checking activity. We also corrected the description of Trendolizer to reflect that it detects viral content, not specifically viral misinformation. Lead Stories has not provided a revenue breakdown that would allow us to distinguish between income from Trendolizer licensing and fact-checking. Without such clarity, separating the two activities does not resolve the transparency concerns, especially given the stakes and the censorship abuses documented in Romania.
AFP Verificat
Fact-Checker, Content Producer
Our first contact with AFP Verificat was stratospherically misleading. Their initial reply stated that, “as a news agency, we do digital investigations but do not directly moderate content, so we are not the right interlocutor.” They then provided a link to their… fact-checking activities. Of course, their role under Meta’s Third-Party Fact-Checking programme makes them an active part of the moderation chain: when AFP Verificat rates a post as false, partly false, or missing context, Meta links the fact-check directly to the post, and its algorithm can limit visibility or extend the label to other identical posts. In practice, their assessments shape moderation outcomes, even if AFP insists it does not “moderate.”
When pressed about transparency, AFP replied it wouldn’t waive confidentiality on its Meta contract, even if Meta itself were to agree. They also declined to disclose the amounts received under the programme, citing “business confidentiality.” On TikTok, AFP clarified they do participate in the platform’s global fact-checking programme in several EU countries (France, Belgium, Greece, Cyprus), but not in Romania.
AFP further confirmed that neither AFP Verificat nor AFP Fact Check has ever been invited to Romania’s Rapid Response System. They are also not direct signatories of the 2022 Code of Practice on Disinformation, though AFP Fact Check has contributed to EU-level DSA consultations. This creates a blurred line between AFP Verificat’s limited local role and AFP Fact Check’s broader European influence.
Editorial choices also highlight the selective framing at play. During Romania’s 2024 elections, AFP Fact Check published a piece clarifying that ballots remained valid even if ink bled through the thin paper. While technically correct, this “fact-check” ignored the real scandal: the ballot design allowed vote buyers and coercers to verify compliance at a glance—one stamp placed in the middle, the other at the bottom. AFP’s intervention reassured audiences wary of ballot secrecy and validity being compromised, focusing on a minor technicality while sidestepping a major democratic vulnerability.

Eurocomunicare
Fact-Checker, Government Consultant
Eurocomunicare only came to our attention through the EDMO website, where it is listed among Romanian fact-checkers. Outside of that, its visibility is minimal. Despite presenting itself as a fact-checking organization, it delivers very little content and has almost no public profile compared to other players in the field.
What stands out is its funding structure. Eurocomunicare has received money from EU contracts (notably in the health field), from the Romanian government, and from NATO-linked projects. This raises obvious questions of dependence and alignment, especially for an entity entrusted—at least on paper—with evaluating the accuracy of online content.
According to Cristina Guseth, director of Freedom House Romania, the team behind Eurocomunicare is indeed competent: “they accessed European funds with another institution, SNSPA, the University of Political Sciences.” In other words, Eurocomunicare appears to function as an academic-NGO hybrid, with funding channels and affiliations more visible than its actual output. Also, credentials delivered by the head of Freedom House could be the opposite of a green flag.
The paradox is striking: while officially listed as a fact-checker in Romania’s DSA ecosystem, Eurocomunicare produces little verifiable work for public scrutiny, yet enjoys access to sensitive funding streams and recognition from European structures. It could technically be invited to the Rapid Response System to flag content—we’re still waiting for a written confirmation that it hasn’t.
Freedom House
Ambiguous Role and Influence Networks
Note: due to long-standing animosity from Ms. Guseth toward co-author Patrick-André de Hillerin, the reporting on Freedom House was handled by Stéphane Luçon.
Freedom House was not initially expected to appear in Romania’s DSA moderation architecture. Yet it is officially listed on EDMO as a fact-checker. Director Cristina Guseth rejected that label in our conversation: “We are not fact-checkers, how should I put it… accredited fact-checkers: we are not.” Instead, the organization operates mainly through PressHub.ro, a platform it funds and coordinates, which brings together around 40 local outlets.
Freedom House presents its mission as sustaining “quality journalism” in areas where local press has been captured by political-business networks, what Guseth calls “capture groups.” In this sense, it functions more as a network builder than a classic fact-checking body. Still, PressHub has produced counter-disinformation content resembling fact-checks—such as articles countering claims about Romania’s role in Ukraine—without following standardized methodology.
According to EU grant data obtained through Patriots for Europe’s FOI request to the Commission and published online, Freedom House has directly received over €800,000 from the EU over the last few years, within projects totaling more than €8 million. The significance lies both in the direct sum and in the organization’s ability to coordinate and distribute large projects, creating networks of partners and dependencies. Guseth nevertheless emphasized the fragility of Freedom House as an NGO—understaffed and grant-dependent—but confirmed close ties with Expert Forum and SNSPA, both prominent in Romania’s moderation landscape.
Our conversation with Ms. Guseth grew tense when we raised the censorship of journalists such as Ion Cristoiu, long known as one of her detractors, who accuse her of representing Soros and USAID influence. Her relationship with part of the press became strained after her attempt to join the Romanian executive as Minister of Justice in 2015—an appointment proposed in the aftermath of protests widely described as Soros-backed, which ultimately failed after her parliamentary hearing. This history partly explains her longstanding feud with de Hillerin, who criticized that attempted appointment at the time.
Several issues remain unclear. Freedom House declined to say whether dissenting views on Romania’s support for Ukraine should be considered legitimate—a sensitive point given the Romanian Prosecutor’s recent broad definition of “Russian influence,” pointing at speech that “creates anxiety among the population.” PressHub has published unbylined articles favorable to former minister Sebastian Burduja, a controversial figure who has encouraged SLAPPs against NGOs, and it has also carried articles supportive of Getica Group, a group of Romanian volunteers fighting in Ukraine whose online activity included death threats against Romanian and EU citizens and “enemy lists” that went uncondemned. Finally, Ms. Guseth revived past disputes over international adoptions, accusing critics of siding with “child traffickers”—a claim that touches on one of Romania’s most sensitive controversies, since the adoption ban was originally introduced to prevent further abuses, and her alleged support for restarting them can only be a matter of great concern.
Overall, Freedom House occupies an ambiguous place in Romania’s DSA ecosystem: listed as a fact-checker yet denying that role, producing counter-disinformation content without methodological clarity, coordinating EU-funded projects that extend its influence networks, and leaving several concerns unanswered. This combination justifies its relatively high DSA Risk Score of 7.
And again: if it weren’t for the “fact-checker” status displayed on EDMO, Freedom House would not even have been covered here.
Elie Wiesel Institute for the Study of the Holocaust in Romania (INSHR-EW)
Trusted Flagger
The Elie Wiesel Institute holds one of the most sensitive and legitimizing roles in Romania’s moderation architecture: it is a Trusted Flagger under the DSA, empowered to report antisemitic hate speech forbidden by law, Holocaust denial, and the cult of war criminals for fast-track removal by platforms.
The Institute itself has admitted that moderation is not a central activity: “Reporting content to platforms is not one of our main activities, attributions or areas of focus. Our obligation as a trusted flagger is to publish an annual report, which we will send to the DSC at the end of the year, to be made public.”
According to them, in its first seven months of activity, the INSHR-EW flagged 89 posts across Facebook, TikTok, X, and YouTube. Platforms acted in only 53 cases (~60% compliance). On TikTok, only half of the reports led to any action—usually geo-blocking rather than removal. Either platforms resisted, or the Institute’s reports stretched the definition of “illegal content.” The Institute further confirmed that many of its own reports never appear in the EU’s official DSA Statement of Reasons database, showing transparency gaps exactly where oversight is most needed.
The Institute’s annual report also blurs categories. Alongside classical antisemitism monitoring, they fold in themes such as countering the far-right, countering Russian influence, condemning Orthodox rehabilitation of former Legionaries (Iron Guard members), and even contesting narratives around anti-communist resistance. Such conflation risks weaponizing Holocaust remembrance to target broad categories of political dissent.
This concern is reinforced by public statements. In November 2024, the Institute declared:
“The far right in Romania isn’t just a vulnerability, it’s now a reality! It represents over 35% of the electorate’s choices. The far right means Holocaust denial, antisemitism, racism, denial of rights and freedoms for any minority, sovereigntism, external pro-Russian orientation, anti-Europe and anti-NATO.”
Depending on how one counts, this communiqué contains up to ten unfounded claims. Equating sovereigntism with antisemitism is especially toxic: if the defense of national sovereignty is treated as antisemitism, then democratic self-determination itself becomes a target. Far from strengthening the fight against antisemitism, such rhetoric risks discrediting it.
Three months later, Adina Marincea, the Institute’s researcher bearing the Trusted Flagger’s role, publicly advanced the theory of “antisemitism without Jews.” In a PressHub interview, she argued that words like “globalists,” “neo-Marxists,” “New World Order,” or criticism of Soros and Kissinger should be understood as coded antisemitism—even though only a few thousand Jews remain in Romania. Such claims are absent from the IHRA working definition of antisemitism, which the Institute itself is supposed to uphold. These statements were published by PressHub (powered by Freedom House). Marincea’s Trusted Flagger role compounds the concern: if antisemitism can be defined as any criticism of “globalists” or of Soros, are the Institute’s censorship powers extending far beyond genuine antisemitism? A tool meant to protect against “illegal content” thus risks becoming a political weapon against dissenting views.
To their credit, the Institute did engage in correspondence, providing numbers and confirming limits. They also clearly stated that they had not flagged any of the major independent voices we were tracking. Yet they refused to grant access to a full review of individual flagging instances.
The Wiesel Institute’s legitimacy rests on Holocaust remembrance. But its Trusted Flagger role, combined with partisan rhetoric, raises concerns that the DSA’s missing safeguards may allow it to drift into discretionary censorship.
Save the Children (Salvați Copiii România)
Trusted Flagger
Save the Children Romania participates in the DSA ecosystem as a Trusted Flagger. Its designated remit is child protection online, especially reporting child sexual abuse material and related illegal content. On paper, the mandate looks narrow and legitimate.
Responsiveness, however, is close to nonexistent.
In response to formal questions, the organization acknowledged its role but refused to provide any meaningful statistics on moderation actions. They confirmed that they do not receive Statements of Reasons reference numbers from platforms—one of the key transparency safeguards in the DSA. The only commitment made was to release consolidated results at the end of 2025, long after their flagging activities had already been exercised.
The organization denied involvement in the Rapid Response System during elections. Yet questions remain unanswered about whether they have flagged or contributed to the suppression of critical independent voices in Romania’s media landscape. Despite repeated written requests, Salvați Copiii has not clarified this crucial point.
Concerns are compounded by at least one suspected abusive flagging incident: an Associated Press / Alamy photograph, republished by a leading conservative website to criticize a gay pride march, was reportedly targeted as “child pornography.” If Save the Children is behind this flagging (not confirmed at the time we publish this report), it would illustrate how easily a child-protection mandate can be misused to censor political or cultural criticism.
Internationally, Save the Children is no stranger to controversy. In Pakistan, the NGO faced a CIA-related scandal after it was alleged to have facilitated covert operations under humanitarian pretexts. That history underlines the need for heightened transparency whenever the organization operates in sensitive environments with censorship powers.
So far, however, Salvați Copiii’s role remains opaque. The refusal to clarify whether they touched content from prominent Romanian journalists and academics leaves open the possibility of discretionary or abusive interventions carried out under the guise of child protection.
Out-of-Court Settlement Bodies (OCSBs)
Under Article 21 of the DSA, users whose content is restricted should, in theory, be able to seek redress through independent out-of-court settlement bodies (OCSBs). In practice, this mechanism is almost non-existent in Romania.
Romania has no accredited OCSB at this stage. For a Romanian user, the only option is to reach out to foreign NGOs accredited elsewhere in the EU and hope they will take up the case. This already undermines accessibility and raises jurisdictional questions.
Even at the European level, the system looks dysfunctional. After directly contacting each of the EU-accredited OCSBs, only two replied (ADROIT and USER RIGHTS) — and they confirmed they cannot handle Romanian proceedings, offering only English with automated translation support.
Moreover, the independence of some OCSBs is questionable. Appeal Center Europe, for example, has received support from Meta among its backers—a glaring conflict of interest for a body meant to adjudicate disputes involving platform decisions. If redress mechanisms are partially funded by the very corporations they are supposed to oversee, the credibility of the system collapses.
In short, the OCSB layer of the DSA’s “trust and safety” architecture currently offers little more than a façade: inaccessible for Romanians, unreliable across the EU, and in some cases compromised by conflicts of interest.
For ADROIT, our estimated DSA RISK SCORE is 1 and we salute their willingness to engage and potentially defend some of the cases mentioned to them.
For USER RIGHTS, our estimated DSA RISK SCORE is 1 and we salute their willingness to engage and potentially defend some of the cases mentioned to them.
For ACE, our estimated DSA RISK SCORE is 3,5 - no contact could be established so far.
Moderating Teams
Behind every flag, fact-check, or takedown order, the final decision still rests with platform moderation teams. Yet here, transparency is almost entirely absent. The DSA’s Statement of Reasons database should record every moderation decision taken by platforms, but in practice we found barely any trace of the censorship operations that occurred in Romania in 2025.
Independently from this lack of transparency, moderation teams are vulnerable to bias and infiltration. As highlighted in Sam Biddle’s now-famous reporting, moderators can become tools of state influence rather than neutral arbiters of rules. This risk is particularly acute in Romania, where part of TikTok’s moderation has been subcontracted to firms far from public scrutiny—for instance, a moderation team in Brașov is managed by the leading French call center company Teleperformance (after it purchased Majorel).
This information was well hidden, but we were able to uncover it through the transcript of hearings at the French Parliament (Assemblée nationale, Rapport d’enquête sur TikTok, Sept 2025) and a Business Insider article in which a moderator recounted his experience in 2023: “It took me 2 months to recover from working as a TikTok moderator. I made less than $7 an hour. Every day in the office, I would see my coworkers cry.”
The opacity runs deeper. No public spokesperson exists for moderation in Romanian at any of the leading social networks, leaving users without any point of contact.
The only exception appears to be TikTok, but even here the experience was worse than disappointing: two PR representatives, Robert Bogdanffy and Paolo Ganino, initially replied and then simply stopped answering messages. No official explanations are ever provided, even when an appeal succeeds and a post is reinstated. Users remain in the dark about why the original decision was made, what standards were applied, and who exactly made the call.
Moreover, ANCOM itself—the Romanian Digital Services Coordinator—hits the same wall. According to its Vice-President, even formal requests to obtain the list of moderators responsible for Romanian-language content were flatly refused by the platforms.
In effect, platform moderation in Romania is a black box—not only for users, but even for the national authority tasked with enforcing the DSA. Outsourcing, lack of accountability, and refusal of basic transparency together create an environment where moderation can be arbitrary, unaccountable, and potentially abusive.
For this reason, we would give a risk score of 3,25 to X, whose Community Notes at least offer users some protection against fact-checker abuse; 4 to Meta, because it has not terminated its fact-checker contracts and still relies on them; and 4 to TikTok, which represents an additional risk both through its dishonest engagement in conversation and its reliance on opaque outsourcing methods of moderation.
X - DSA RISK: 3,25
META - DSA RISK: 4
TIKTOK - DSA RISK: 4
Why the DSA Fails at Transparency, Appeal, and Redress
The Digital Services Act promised to usher in a new era of digital rule of law. It was meant to ensure that, just as in a functioning justice system, transparency and due process would protect users. People would know why their content was removed, have a right to appeal, and see real checks and balances in action. The European Parliament even reinforced the original proposal with extra safeguards for fundamental rights.
But what was promised on paper has turned into a starkly different reality in Romania. Instead of a fair and transparent system, we see a web of overlapping roles and conflicts of interest. The same actors act as judges, prosecutors, and informants. NGOs that create content also flag competitors. Flaggers and fact-checkers blur into the same entities, leaving no real accountability. And while the DSA’s text mimics the idea of justice, in practice user rights remain a distant promise.
Independent journalists are censored without warning or explanation. Constitutional guarantees like freedom of expression (Article 30) and the right to information (Article 31) are violated in plain sight. The DSA, far from securing rights, has become a tool for those in power to silence critical voices. This isn’t just a Romanian anomaly; it’s a warning for Europe and beyond.
The Database That Hides Everything
The DSA’s Statement of Reasons database should be the cornerstone of transparency—tracking every moderation action with clear explanations. Instead, most Romanian journalist cases have no entries at all. When entries do exist, they provide meaningless boilerplate that tells users nothing about why their content was removed.
Most critically, the database never identifies which trusted flaggers, fact-checkers, or government agencies triggered the censorship. Users cannot match their experiences to database entries, making verification impossible. The scale of this opacity is staggering: when contacted, even the Wiesel Institute—Romania’s Trusted Flagger for antisemitic content—confirmed they could only track back one of the more than forty instances they had flagged since receiving their status.
This isn’t transparency; it’s transparency theater. The database exists to give the appearance of accountability while ensuring that the real decision-makers remain hidden from view.
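For readers who want to test this opacity themselves, the Statement of Reasons database is publicly queryable. The sketch below shows what a search for one’s own takedown might look like; the site is real, but the endpoint path, parameter names, and response fields are our assumptions for illustration, so consult the Commission’s API documentation and request an access token before use.

```python
# Sketch of searching the DSA Transparency Database for an entry matching a
# takedown. The database exists (https://transparency.dsa.ec.europa.eu); the
# endpoint, parameters, and fields below are illustrative assumptions only.

import requests

API_ROOT = "https://transparency.dsa.ec.europa.eu/api/v1"  # assumed API root
TOKEN = "YOUR_API_TOKEN"  # access to the research API requires a token

def search_statements(platform: str, date: str) -> list[dict]:
    """Fetch statements of reasons filed by a platform on a given day
    (parameter names are illustrative, not the documented schema)."""
    resp = requests.get(
        f"{API_ROOT}/statements-of-reasons",
        params={"platform_name": platform, "created_at": date},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("data", [])

# A censored user would scan the day's entries for their own case -- and, as
# documented above, typically find no matching entry at all.
for sor in search_statements("TikTok", "2025-04-09"):
    print(sor.get("category"), "|", sor.get("decision_ground"))
```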
Appeals That Go Nowhere
The DSA promised meaningful recourse through internal appeals and out-of-court settlement bodies. In practice, platforms appear to reject appeals almost automatically. Out-of-court settlement bodies are out of reach for regular Romanian users and, as mentioned earlier, are often unresponsive even when contacted.
At the systemic level, the failures are even more pronounced. ANCOM’s restrictive interpretation of Article 53 may block complaints about systematic discrimination. In a written response, it argued that a journalist could not submit other journalists’ censorship cases as evidence of systemic risk—a position another Digital Services Coordinator has confirmed is incorrect. We have been waiting for almost a month for ANCOM to acknowledge this error and escalate the complaint to the actually responsible coordinator, Coimisiún na Meán (CnaM) in Ireland. Only today, September 24, as we publish, did ANCOM send a partial reply: it claims to have forwarded part of our complaint concerning Facebook and X, while requesting additional information about the TikTok account’s ownership mentioned. This raises concerns about the seriousness of their intent to address the systemic risks identified in the complaint, and suggests a possible attempt to further delay transparency regarding TikTok’s censorship of journalists.
The Real Problem: A Closed Loop of Power
The core issue isn’t just bureaucratic dysfunction—it’s role redundancy that creates a circular system where the same actors reinforce one another’s decisions. Organizations funded to fight “disinformation” have incentives to label even satire or political jokes as harmful. And in Romania’s moderation architecture, there is not a single grassroots NGO sustained by members or subscriptions.
Every player depends on Big Tech, government grants (domestic or foreign), Soros foundations, or other major funders. This produces a façade of civil society: government-aligned organizations dressed up as NGOs. Their political alignment and push for preferred narratives represent a textbook case of astroturfing. Fact-checkers, trusted flaggers, and policy advocates co-sign one another’s open letters demanding more powers for themselves, instead of offering checks and balances or defending users’ rights.
Their alignment with executive power makes them look less like watchdogs and more like auxiliaries of state propaganda and censorship—a reality underscored by the annulment of Romania’s 2024 election on flimsy grounds.
An illustrative case is the disinformation spread as a “fact-check” by Funky Citizens regarding a lobbying contract between the AUR party (opposition, sovereigntist) and a U.S. lobbying agency. Their fact-check claimed to verify whether a payment had been made and concluded that it had—while presenting only proof of the contract’s registration under the Foreign Agents Registration Act (FARA). Yet FARA registration is a prerequisite for any potential payment, not proof that one occurred. There is a clear difference between registering a contract and actually transferring money. By conflating the two, the fact-check itself became disinformation.
Co-author Patrick-André de Hillerin explained this distinction clearly in a Facebook post analyzing several lobbying contracts registered under FARA.
We raised this issue repeatedly in our email exchanges with Funky Citizens and their editorial team (“Factual”), but they refused to acknowledge the difference between registration and payment. Such stubbornness—or outright bad faith—can only be corrected if fact-checkers themselves are subject to scrutiny. This is why a system allowing fact-checks to be challenged, whether through community notes, peer fact-checkers, or input from the “wronged” side, is essential.
Executive Power Capture
When ANCOM fails to supervise trusted flaggers, rapid-response participants, fact-checkers, and other moderation actors—when it refuses to investigate the systematic censorship of independent journalists—the DSA becomes a tool of authoritarian control. Romania’s downgrade from “Flawed Democracy” to “Hybrid Regime” in the Economist Democracy Index reflects this broader backsliding.
In this context, the DSA, designed to bring transparency and user protection, appears weaponized by executive power to silence independent voices. This violates Romania’s own constitutional guarantees of freedom of expression (Article 30) and the right to information (Article 31): users see only the platform’s final decision—never the chain of government agencies, NGOs, and foreign actors who influenced it.
The DSA’s safeguards don’t just fail; they provide cover for a system of political control operating in the shadows.
Censorship Everywhere, Justice Nowhere
To paraphrase Victor Hugo: under the DSA as applied in Romania, censorship is everywhere, justice nowhere.
The DSA’s safeguards target the wrong actors. Users can challenge platforms but never the entities that actually trigger censorship. The real threat isn’t just the platforms; it’s the opaque web of government agencies, NGOs, and foreign actors empowered by the DSA while shielded from scrutiny. Transparency mechanisms become a smokescreen for the very abuses they were meant to prevent.
The DSA was built on the assumption that platforms were the primary threat to users’ rights. In reality, the greater risk now comes from the network of third-party actors who exploit the DSA’s framework to wield censorship power while avoiding accountability. Platforms become mere executors of decisions taken elsewhere, while the supposed safeguards—databases, appeals, settlement bodies—either don’t function or are controlled by the same actors driving the censorship.
This shouldn’t surprise anyone. Social networks once posed the biggest challenge to both state propaganda and the news industry’s monopoly on narrative control. When Obama was elected thanks to a Facebook campaign, nobody complained. But when Brexit, Trump, and the Yellow Vest movement upset the establishment, the response was very different.
In France, the Yellow Vests used social media both to organize protests and to document state repression (see David Dufresne’s reporting on police violence and the recent Thomas Fazi/Pascal Clérotte report on the French Twitter Files). The current implementation of the DSA looks like a reaction to that challenge—an attempt to reassert narrative control under the guise of fighting “disinformation.”
This is the paradox at the heart of Romania’s experience—and a warning for all of Europe. What was designed as digital due process has instead become a system for avoiding accountability while maintaining the appearance of justice. Until that misdirection is corrected, the DSA will continue to serve as cover for the abusive censorship it was supposed to prevent.
Final Remark on the Platforms
Let us not imagine that the platforms are innocent bystanders. They remain among the most profitable corporations in history and have the means to deliver a better experience to their users.
A good actor—even under political pressure—would still provide users with clear explanations, disclose who initiated a takedown, and dedicate resources to real user support. Instead, they hide behind the DSA. Their silence is complicity. A full release of data on the flaggers and their actions is the only way to ensure transparency about what happened in Romania, particularly regarding the series of censorship instances detailed below.
Meet the Censored: Gaza, Conservative Voices, and Jokes
The systematic nature of censorship in Romania becomes visible when individual cases are examined side by side. These are not isolated incidents or algorithmic accidents. They form a pattern: independent journalists, critical academics, and even satirists targeted across multiple platforms, often without explanation, appeal, or accountability.
⸻
Ion Cristoiu — Veteran TV Personality Silenced
One of Romania's best-known TV journalists, Cristoiu saw his TikTok account suspended on April 9, 2025. No reason was communicated. A decades-long presence in Romanian media vanished overnight, cutting off his access to younger audiences who had migrated to TikTok.
Why it matters: Silencing Cristoiu shows that even mainstream, long-established journalists are not immune. It demonstrates how easily a lifetime of influence can be erased by a single opaque platform decision.
⸻
Robert Turcescu — From Scrutiny to Erasure
Another veteran journalist, Turcescu, faced a campaign of escalating restrictions:
April 14, 2025 — TikTok removed his commentary on a U.S. delegation overseeing Romania's electoral process.
May 6, 2025 — A video analyzing "Operation NDP," an alleged strategy to secure Nicușor Dan's rise to power, was flagged.
July 24, 2025 — His entire TikTok account was permanently suspended.
What began as takedowns of individual posts ended with the total erasure of his TikTok presence, a severe blow for a journalist whose content is primarily video.
⸻
Marius Tucă — Multiform Pressure
Known for sharp editorials, Tucă's work came under attack on several fronts:
March 14, 2025 — An editorial on the alleged "Coup d'État" was flagged by the CNA. Tucă later self-censored to protect his accounts.
May 28, 2025 — Two Facebook posts were removed for using the name or image of Telegram's Pavel Durov, though the same article was later accepted after minor metadata edits.
Feb 21, Mar 25, Apr 16, 2025 — TikTok videos were repeatedly removed.
The pattern shows how pressure across platforms can force even established voices into self-censorship. Mr. Tucă, whose entire archive—including video from the 1990s, before YouTube even existed—was painstakingly uploaded by his team, described to us the direct chilling effect of the actions taken against him: he feared losing an immense online video archive of his work.
⸻
Stéphane Luçon & Le Monde diplomatique — Targeted on X, Facebook and TikTok
As publisher of the Romanian edition of Le Monde diplomatique, Luçon (co-author of this report) faced moderation on all three main platforms:
Since March 2024 — The Le Monde diplomatique Romania Facebook page was blocked from posting after publishing Benoît Bréville's famous editorial "Ukraine and Gaza: double standards" (in French: "Si les vies se valaient"), then disappeared from the manager's interface a year later.
May 21, 2025 — Luçon's personal X (Twitter) account (@sfglucon) was suspended minutes after sharing an interview excerpt in which the President-elect admitted his opponent would not have left the EU or NATO, but would have blocked support for Ukraine and Moldova—a statement contradicting the fear campaign spread by, among others, partisan members of Romania's moderation NGOs.
May 28, 2024 — The Le Monde diplomatique TikTok account was heavily shadow-banned after posting a screenshot of deputy director Anne-Cécile Robert's article: "The International Court of Justice identifies a plausible risk of genocide in Gaza." Robert, a French journalist specializing in European institutions and Africa and member of Le Monde diplomatique's editorial board, holds a PhD in European Union law and is associate professor at the Institute of European Studies of the Université Paris-VIII.
In this case, both Romanian political commentary and international reporting on Gaza became triggers for censorship.
⸻
Andrei Murgescu & the 1776 Podcast — Satire Targeted
Murgescu's satirical post of May 20, 2025—an ironic jab using an authentic photo of President-elect Nicușor Dan—was fact-checked as "false." Earlier, on May 11, 2023, a full-length debate shared by the 1776 Podcast was flagged as "missing context," despite linking to an upcoming live discussion.
The absurdity is telling: satire and the mere announcement of a debate are treated as misinformation.
⸻
Victor Roncea & ActiveNews — Mislabeling as Abuse
On May 18, 2025, both Roncea's personal Facebook account and the ActiveNews account were threatened with suspension. The trigger? Republishing a stock AP/Alamy photo, publicly available elsewhere on the platform, that was flagged as "child nudity and sexual exploitation." Roncea and his leading website, ActiveNews, represent one of the most important platforms for conservatives, Orthodox voices, MAGA sympathizers, critics of the "globalists," and people who criticized sanitary authoritarianism and the absence of a "precautionary principle" regarding COVID vaccines.
Whether one agrees with Roncea's views or not, the inaccurate flagging leading to threats of account suspension demonstrates how unchecked power can be misused. It is not yet clear to us whether someone from Save the Children (a Trusted Flagger empowered to report illegal content involving child exploitation) or a coordinated user campaign triggered the threat of suspension.
⸻
Vlad Mercori — Investigative Journalism Punished
Vlad Mercori is one of the most successful investigative journalists on TikTok; his incisive videos challenge power and spare virtually nobody on the political and business landscape. In August/September 2025, his TikTok reach collapsed by 80% after he covered an overpricing scandal involving electric school minibus procurement in Bihor County during Ilie Bolojan's tenure as county president (he is now Prime Minister), a matter involving potential misuse of European funds. Critical coverage of EU-funded projects colliding with the direct interests of the current Romanian executive seems to have triggered the brutal shadow-banning of his account.
When contacted to comment on the status of Mercori's account, Mr. Popescu, ANCOM's Vice-President, personally confirmed to us that it looked like an obvious shadow-ban by TikTok.
⸻
Paul Dragoș Aligica — The Restricted Academic
Paul Dragoș Aligica — Professor of Governance at the University of Bucharest, Senior Research Fellow at George Mason University, Senior Nonresident Scholar at the University of Pittsburgh, member of the Romanian Academy and of the Academy of Europe, with books published by Oxford and Cambridge University Presses — is a critical voice in Romanian mass media, regularly publishing detailed analyses on governance, information flow and freedom of expression.
His Facebook account has been under "restricted" status since an undetermined date; he identified it as such on August 25, 2025. The restrictions are based on posts and pictures already removed by Facebook in March 2020 and April 2023, with no clear "rule" broken and no resolution path available.
⸻
A System, Not Accidents
Across these cases, the narrative enforcement pattern is unmistakable:
Gaza reporting, conservative commentary, and satire are disproportionately targeted.
Veteran journalists, academics, and independent publishers face bans, shadow bans, or adapt and self-censor.
Appeals go nowhere; explanations are never given.
In each case, the safeguards promised by the DSA—transparency, appeal, redress—were absent.
These examples raise serious questions: Is it just the malfunction of a system hijacked to serve power? Or is it the intended purpose of the DSA: a censorship architecture designed to silence independent voices while hiding the decision-makers behind a façade of processes that only exist on paper?
Who Will Hold the Censors Accountable?
The evidence is overwhelming: Romania's DSA implementation has been captured by political interests and weaponized against independent journalism. Instead of a just system, with proper governance, the current DSA implementation is an architecture of narrative enforcement—one that silences critics while protecting those who abuse it.
But the problem is spreading. The abuses now extend beyond Romania's borders, with content about Moldova's election being censored, even though Moldova is neither an EU member state nor subject to the DSA.
There is also a real risk that Romanian-vetted actors will play similar roles in other countries, or will serve as a model, spreading the same abuses elsewhere, as in the Czech Republic, where a Rapid Response System was quietly activated and only unveiled because we asked the question during our investigation. Meanwhile, members of the Romanian executive have boasted about how the Romanian example is being studied elsewhere.
Rapid Response System participants receive direct access to report content to the platforms. NGOs like Funky Citizens or Demagog CZ, which are already fact-checkers for Meta, gain an additional capacity to report content to other platforms. NGOs like Expert Forum, which previously had no official reporting channel, now possess one thanks to the Rapid Response System.
The most concerning part, as recently reported, is that participants in the Rapid Response System try to frame it as a voluntary tool for the public good, while refusing even the minimal transparency of an open desk to disclose their work. The "official" https://disinfocode.eu website has no contact form and no identified publisher, and it even infringes European privacy rules by setting Google Analytics cookies without displaying a proper consent banner.
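Anyone can reproduce this finding. Below is a minimal Python sketch, not a full compliance audit, that fetches the site's homepage and looks for Google Analytics and Tag Manager loaders in the initial HTML, the scripts that typically set tracking cookies before any consent is given (the tracker signatures are illustrative; scripts injected later by JavaScript would require a headless browser to detect).

# Minimal sketch: check whether a page embeds Google tracking
# loaders in its initial HTML, i.e. before any consent interaction.
import re
import urllib.request

URL = "https://disinfocode.eu"  # site named in this report

# Signatures of common Google tracking loaders (illustrative list).
TRACKER_PATTERNS = [
    r"googletagmanager\.com/gtag/js",        # gtag.js (GA4)
    r"google-analytics\.com/analytics\.js",  # Universal Analytics
    r"googletagmanager\.com/gtm\.js",        # Google Tag Manager
]

req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})
html = urllib.request.urlopen(req, timeout=15).read().decode("utf-8", "replace")

for pattern in TRACKER_PATTERNS:
    if re.search(pattern, html):
        print(f"Found tracker loader: {pattern}")

# A consent-compliant site would only load these after the user
# accepts cookies; finding them in the raw HTML is a red flag.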
Yet the participants are in fact well organized—they operate through a Permanent Task Force, according to an official answer received from the Romanian DSC:
“The Task Force is chaired by the European Commission and includes the signatories of the Code of Practice on Disinformation, as well as representatives of the European Digital Media Observatory (EDMO) —that is fact-checkers—, the European External Action Service (EEAS)—that is diplomats—, and the European Regulators Group for Audiovisual Media Services (ERGA)—that is media regulatory institutions. The Task Force has been entrusted with specific responsibilities such as the establishment of a Rapid Response System during election periods and crises.”
The absence of a mandatory national participant in the process of designating RRS flaggers makes this all the more worrying—it is EU-level censorship applied to national debates, with a dramatic lack of transparency.
Moreover, at publishing time, we learned from a participant in the Rapid Response System that the Commission may be pushing for secrecy about its participants "in order to protect the integrity of those organizations."
The Accountability Vacuum
Many key actors in Romania's censorship machinery have crossed red lines, yet none face consequences. The current system creates perfect impunity for bad actors:
Expert Forum should be stripped of its status under the Code of Practice on Disinformation for operating with undisclosed funding, refusing to explain its censorship actions, and bearing too many conflicts of interest. Its close connection to the Moldovan executive—recently producing a report about opposition posts in Moldova that was even submitted to Romania's National Audiovisual Council—shows how far outside legitimate boundaries it operates.
Funky Citizens' Meta contract should be suspended for abusive fact-checking practices and made fully public (to their credit, as mentioned earlier in the report, they have agreed to full transparency of their Meta contract if Meta waives the non-disclosure agreement). Their labels under EDMO, EFCSN, and IFCN should be suspended until a due auditing process identifies how many times they spread "disinformation" through their own fact-check content, as with the lobbying contract discussed above. Their eligibility for Rapid Response System participation should be revoked entirely—you cannot be content creator, fact-checker, and content-flagger simultaneously.
ANCOM should lose its DSC role if it deliberately refuses to protect user rights. During over three months of investigation, Romania's Digital Services Coordinator has misled us with partial responses and a dismissive attitude, refusing to acknowledge its responsibility to investigate whether the DSA is being weaponized against independent journalists. At publishing time, we do not know whether ANCOM will actually forward the complaint lodged under DSA Article 53 to the Irish DSC, the coordinator responsible for Facebook, TikTok, and X.
CNA members who meddle with Moldovan content—clearly outside their jurisdiction—should not hold such positions, yet they form a majority within the council.
Freedom House's complacency (or complicity) on highly controversial matters reveals an even deeper problem—this isn't just a DSA-level issue, but a question of how actors who escape normal rule-of-law consequences can receive support from DSA participants. This exemplifies how Romania operates as a hybrid regime, where bad actors leverage the massive power of EU and international funding for their own clientelist interests.
Yet none of this is likely to happen. The DSA's enforcement mechanisms assume good-faith actors operating within legitimate boundaries and offer no way to punish bad actors. Romania shows what happens when bad-faith actors seize the levers of moderation.
The Transparency Crisis
The transparency crisis is equally stark. Only the CNA maintains a proper public record—flawed and hard to browse, but at least available to the public.
Every other actor in this ecosystem operates in complete darkness. Users never learn who flagged them, what evidence was submitted, or how decisions were reached.
We need total transparency: the actual reason for every action, disclosure of every moderation decision, full disclosure of participants' funding sources, and clear identification of the partisan actors driving censorship campaigns. Until we know who is flagging what content and why, the DSA remains a black box serving political interests rather than user rights.
If the DSA is to have any credibility, European citizens must obtain real-time disclosure of every flag, every takedown, and every actor behind them. Anything less is a blank check for abuse.
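Part of the raw material for such scrutiny nominally exists: the Commission's DSA Transparency Database (transparency.dsa.ec.europa.eu) publishes the statements of reasons platforms must file, and its daily dumps can be filtered locally. Below is a minimal, hypothetical Python sketch showing how one could count Romanian-language moderation decisions per platform; the file name and column names are assumptions based on the database's published statement-of-reasons schema, not verified fields. The point of this report is precisely that many decisions, such as those described above, never appear there.

# Minimal sketch, assuming a daily CSV dump has been downloaded
# from the DSA Transparency Database. The file name and the column
# names ("platform_name", "content_language") are assumptions;
# adjust them to the actual dump you obtain.
import csv
from collections import Counter

DUMP_FILE = "sor-daily-dump.csv"  # hypothetical file name

counts = Counter()
with open(DUMP_FILE, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row.get("content_language") == "ro":  # Romanian content
            counts[row.get("platform_name", "unknown")] += 1

for platform, n in counts.most_common():
    print(f"{platform}: {n} statements of reasons")

# If a takedown you experienced is missing here, the platform may have
# skipped its statement-of-reasons obligation: the gap this report documents.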
Social media platforms are the modern public square. That square needs open-source algorithms too: we cannot navigate public debate if someone, no matter who, is covertly tweaking reach.
We either live in a "Manufacturing Consent" dystopia on steroids, or take back the tools of democratic public debate offered by the information technology revolution. There is no middle way.
Articles 21 and 53: Aiming at the Wrong Target
Articles 21 and 53—the DSA's supposed safeguards for users—expose a fundamental flaw. Appeals and systemic complaints are directed at platforms, while the true decision-influencers remain invisible. Why are platforms held responsible for censorship decisions they did not initiate?
As Mark Zuckerberg, CEO of Meta, admitted: "Fact-checkers have just been too politically biased and have destroyed more trust than they've created." This should draw serious scrutiny to the various participants in the moderation architecture.
Before that, the Twitter Files had revealed extensive coordination between platforms and governments. As journalist Matt Taibbi testified before the US Congress: "When Twitter Files reporters were given access to Twitter internal documents last year, we first focused on the company, which at times acted like a power above government. But Twitter was more like a partner to government... With other tech firms it held a regular 'industry meeting' with FBI and DHS, and developed a formal system for receiving thousands of content reports from every corner of government: HHS, Treasury, NSA, even local police."
Michael Shellenberger, testifying at the same hearings, drew the larger conclusion in his statement: “Today American taxpayers are unwittingly financing the growth and power of a censorship industrial complex run by America's scientific and technological elite which endangers our liberties and democracy.”
The lesson for Europe should have been crystal clear. Instead, the DSA's implementation replicates the dynamics uncovered in the United States: enforcement without visible government fingerprints. By coercing social networks into a choice—apply censorship or pay fines—it shifts power to actors who face no checks and balances.
In Europe’s DSA regime, platforms face fines, users are silenced, and censors walk free.
A Warning for Europe
What happened in Romania shows exactly why voters across Europe should demand transparency. The Rapid Response System activates quietly, without public announcement, and flagging powers are handed to partisan NGOs with opaque funding and ties to power. When organizations act as content producers, fact-checkers, flaggers, and policy advocates all at once, the result is exponential conflicts of interest—with no transparency and no accountability.
We shouldn't consider Romania an anomaly because of its specific status as the first hybrid regime of the European Union (where member states are expected to be democracies). Take the Czech Republic: we only discovered that an NGO, Demagog CZ, had quietly received Rapid Response powers because we asked them directly. Technically, this could let them influence what citizens see online during elections—without voters ever knowing what was flagged, why, or by whom. The problem is not about proving individual cases of abuse. The problem is that the system allows abuse while keeping it invisible, as if it were designed for that purpose, and these flaws seem ready to spread across the whole European Union.
The Path Forward
Until Europe builds true transparency and accountability into the DSA, the law will remain what Romania has revealed it to be: not a shield for citizens, but a shield for censors — and a weapon for power.
The DSA promised digital rule of law. In Romania, it delivered digital injustice. Unless Europe corrects course, this Injustice System will not stop at Romania’s borders.
This report was authored by Stéphane Luçon and Patrick-André de Hillerin. No part of this publication may be reproduced, in English or in any other language, without the explicit consent of the authors.
Note: This is an ongoing investigation into systematic DSA violations and censorship abuses curtailing freedom of speech. The investigation aims to provide further clarity and to obtain answers from the European Commission, national DSCs, and all the actors involved.
Strategic Lawsuit Against Public Participation