30 July 2025
Information manipulation and interference by foreign and domestic actors are rising, with recent cases showing local enablers. The EU needs proactive, strategic tools to counter the growing threat.
* Information manipulation and interference are increasingly conducted by domestic actors, alongside foreign ones. The EU’s FIMI definition remains too narrow for this reality.
* Romania’s 2024 vote was marred by AI manipulation and illegal TikTok campaigning.
* 270+ vehicles in Germany were sabotaged in a plot tied to foreign intelligence.
* Slovakia, Hungary and Georgia face backlash over ‘foreign agent’ NGO laws.
* Disinformation now includes influencers, fakes and deepfakes across CEE.
* Physical subversion is rising: protests, bribes and targeted violence.
* DSA enforcement remains patchy despite strong regulatory tools.
AI-driven disinformation and hybrid threats are escalating. Without a strong, coordinated EU response, not only the next European elections but also broader social cohesion and adherence to democratic values are at systemic risk.
The Growing Threat of Disinformation and Interference
Information manipulation and interference have surged to unprecedented levels across Europe, driven not only by foreign actors but increasingly by domestic ones. These expanded origins and tactics now seriously destabilise political systems – as seen in Romania – and threaten fair competition in countries including Germany, Poland and Moldova. DIMI (Democracy Information Manipulation and Interference), which covers domestic actors as well as the foreign ones captured by FIMI, poses a major threat to societal stability and, amid geopolitical uncertainty, to national and continental security.
FIMI/DIMI tactics have evolved from online disinformation to physical subversion, terrorism and sabotage. For example, Romanian social media manipulation coincided with direct foreign support for a presidential candidate. In Hungary, government-backed political advertising skews elections, while anti-EU narratives from members of Slovakia’s government deepen divisions.
As interference increasingly shifts from virtual to physical realms, a broader range of state and EU institutions must engage in coordinated countermeasures. Mere reaction is insufficient; proactive prevention is essential – especially in already polarised societies where extremist voices gain traction.
Digital platforms have fostered echo chambers that isolate and amplify anti-democratic views. Yet platform responses remain inadequate. Meta, for instance, has cut back its fact-checking and election teams, and from October 2025 will ban political, electoral and social issue ads in the EU, citing the legal uncertainty and complexity of the new TTPA regulation.
Effectively countering these threats while upholding Western values and social cohesion becomes a matter of national security, and demands strong institutions, clear laws and rigorous enforcement. Reactive approaches have proven inadequate – countermeasures must evolve in sophistication online and strength offline. Protecting democracy and building resilience against interference are now vital security priorities.
The broad scope of DIMI is evident across the Visegrad+ countries (Poland, Czechia, Romania, Slovakia and Hungary).
Pro-Russian, Anti-EU Narratives: External actors, particularly Russia, have pushed narratives undermining trust in Romania’s democratic institutions and alliances. These narratives often align with ultra-conservative, anti-liberal factions to destabilise politics. Similar messaging is widespread in Visegrad+, especially in Hungary and Slovakia, where senior officials propagate such views.
Legislative Pressure on NGOs: In July 2024, the European Commission warned Slovakia against adopting a law requiring NGOs receiving foreign funding to label themselves as ‘organisations with foreign support’, citing risks of stigmatising civil society and undermining freedoms of association and expression. Similar, though more severe, legislation exists in Georgia. The Slovak law nevertheless came into effect on 1 June 2025; although the labelling requirement was ultimately removed, it still imposes additional administrative burdens on civil society.
Information Manipulation and Disinformation Campaigns: Hungary faces criticism for information control favouring certain political narratives. During election periods, Poland sees populist and divisive disinformation campaigns exploiting societal fractures to influence voters.
Illegal Electoral Campaigns: Social media campaigns using fake accounts and paid influencers to spread false narratives violate electoral finance rules. A notable example is Romania’s 2024 presidential election, where Calin Georgescu orchestrated an illegal campaign.
Physical Sabotage: While Visegrad+ countries have not recently experienced physical sabotage, neighbouring Germany faced coordinated sabotage in early 2025. Over 270 cars had their exhaust pipes blocked with construction foam, accompanied by ‘Be greener’ stickers linked to the German Economics Minister. Initially blamed on radical activists, investigations suggest foreign operatives orchestrated these acts.
Over the past decade, as external and internal malign interference has grown more sophisticated, the EU’s response to disinformation and hybrid threats has mostly been reactive, often lagging behind attackers.
The EU began addressing information manipulation in 2015 with the East Stratcom Task Force, aimed at helping civic actors identify fake news. However, early efforts were limited by underfunding and lack of enforcement powers.
With the rise of blatant foreign interference, the EU expanded its approach to include FIMI. The 2022 Digital Services Act (DSA) introduced potential sanctions for online platforms failing to curb misinformation, giving the EU important leverage – but enforcement remains limited. TikTok, under multiple investigations, illustrates this gap, as no sanctions have yet been applied. Similarly, the European Commission has delayed its investigation into platform X for allegedly violating the DSA, likely awaiting progress in EU-US trade talks. The probe, which was expected before the summer break, has now been postponed. While the Commission insists enforcement is independent, the timing raises questions; under the DSA, a platform found in breach can be fined up to 6% of its global annual revenue.
Beyond the DSA, EU initiatives focus on detection and information sharing through tools like the EEAS’s Rapid Alert System (RAS) and FIMI-ISAC. The Commission also publishes annual Rule of Law reports and launched the European Democracy Shield in 2024 to tackle threats to elections, media freedom, and disinformation.
The Council lacks a dedicated strategic communication unit but includes such functions within its Directorate-General for Communication and Information (DG COMM). Counter-disinformation efforts are mainly coordinated via the EEAS.
In December 2024, Parliament formed a 33-member committee with a 12-month mandate to address FIMI and DIMI, with a first report expected later in 2025. Earlier, the 2022 INGE committee called for sanctions against disinformation spreaders, a unified EU foreign interference strategy, and stronger media literacy. While the DSA partly responds to these calls, many recommendations remain unfulfilled.
Since the adoption of the DSA and the two INGE reports, three new elements have begun to redefine the nature of information manipulation and interference. These developments are not yet – or not sufficiently – reflected in the EU’s current approach.
Interference by domestic actors who manipulate information and act against democratic principles has significantly increased. Foreign influence is now often channelled through these actors, amplifying their actions and statements. Extremist or anti-democratic politicians – whether in opposition or in power – as well as mainstream journalists, influencers and celebrities, have joined the information manipulation space. They exploit their platforms and notoriety to spread falsehoods and incite destabilising action.
To respond effectively, EU action must recognise that both foreign and domestic actors are integral to today’s threat landscape.
Artificial intelligence has rapidly increased the reach, speed and perceived credibility of misinformation. A 2024 study showed that 86% of people worldwide had encountered fake news – and AI-generated content was often seen as more trustworthy than human-generated falsehoods.
AI-generated materials now flood social media. During election campaigns, they are used to create and spread propaganda. For instance, Hungary’s Jobbik Party has used AI-generated images to portray immigrants negatively, aiming to influence public opinion through emotionally charged content. In Romania, AI was deployed to fabricate videos of candidates in compromising situations or making false statements.
Detection tools remain behind the curve, and countermeasures have yet to catch up with the evolving threat.
Malign interference has moved beyond digital manipulation to include real-world subversion. Foreign-funded actors have paid citizens to oppose candidates or amplify targeted narratives, instigated protests and disruptive actions, and orchestrated violence or sabotage. These activities represent a shift to a more extreme and dangerous form of interference.
Such physical acts are more visible and fall more clearly under criminal law – making them easier to prosecute where political will and institutional capacity exist. At the same time, their spectacle further mobilises extremist supporters and deepens societal polarisation.
The impact of information manipulation and interference – both foreign and domestic – on the stability and integrity of a state can be immense, as Romania’s recent experience shows. Failing to act against fake accounts inciting unrest and allowing extremist demonstrations to escalate can quickly trigger an uncontrollable chain reaction.
State institutions play a decisive role in countering DIMI. Proactive monitoring of suspicious online and offline activity is essential to identify threats, prevent escalation and hold malign actors accountable. Effective mitigation depends on interagency cooperation and real-time information-sharing. A coordinated European focus on DIMI ‘hotspots’ increases the impact of national-level countermeasures.
Detection tools must evolve as fast as the threat landscape. Artificial intelligence can help identify malign operations and attribute them correctly – a crucial step for applying the right legal and policy tools. AI can also help differentiate between domestic and foreign sources of manipulation.
The private sector is advancing fast:
The EU’s Horizon Europe programme co-funds vera.ai, which provides journalists and human rights investigators with AI-powered tools for deepfake detection, image verification and network analysis.
Legislation must be updated to address evolving DIMI tactics, balancing individual freedoms with the need to protect national and EU-wide security. While over-legislation should be avoided, better implementation of existing EU rules at the national level would strengthen the legal toolbox.
Key instruments already exist, including the DSA, the TTPA and the European Democracy Shield.
While CSOs remain important in identifying and reporting DIMI/FIMI and strengthening societal resilience, they are no longer the first line of defence. The increasing subversiveness of threats requires well-equipped state institutions to take the lead – with proper mandates, training and resources. However, cooperation with civil society and independent media is essential to reinforce democratic resilience and ensure political accountability.
A coordinated response involving national institutions, civic actors and European institutions is essential. EU regulations empower national authorities to exert pressure on platforms, but only national actors can implement concrete legal and security responses. Their collaboration multiplies the impact of each intervention.
Certain groups are more exposed to manipulation due to limited digital access, low media literacy, mistrust in institutions or existing marginalisation. Women – especially those in public life or from minority backgrounds – are increasingly targeted by gendered disinformation, which seeks to silence, intimidate and discourage their participation in democratic processes. This tactic not only amplifies existing inequalities but also weakens democratic resilience as a whole.
Among these, the Roma community remains the most vulnerable. Their longstanding exclusion has fuelled anti-system sentiment, which malign actors readily exploit. As demonstrated in Romania’s recent elections – and confirmed by Roma for Democracy/Roma for Europe – boosting the resilience of these groups is critical to democratic stability.
There is no one-size-fits-all solution. Targeted, tailored actions are essential.
Alina Inayeh
Galan Dall, Magda Jakubowska, Staś Kaleta, Tomasz Kasprowicz, Anna Kuczyńska, Natalia Kurpiewska, Magdalena Przedmojska, Wojciech Przybylski, Albin Sybera, Luca Soltész and Simon Xiao.
Radu Albu-Comanescu (Romania), Merili Arjakas (Estonia), Alina Bârgăoanu (Romania), Bohdan Bernatskyi (Ukraine), Marysia Ciupka (Poland), Spasimir Domaradzki (Poland/Bulgaria), Martin Ehl (Czechia), Artur Nowak-Far (Poland), Jan Farfał (Poland), Oksana Forostyna (Ukraine), Philipp Fritz (Germany), Ognyan Georgiev (Bulgaria), Marzenna Guz-Vetter (Poland), Jarosław Gwizdak (Poland), Pavel Havlicek (Czechia), Alina Inayeh (Romania), Ruslanas Iržikevičius (Lithuania), Krzysztof Izdebski (Poland), Staś Kaleta (United Kingdom), Matej Kandrík (Slovakia), Christine Karelska (Ukraine), Aliaksei Kazharski (Belarus/Slovakia), Viktoryia Kolchyna (Belarus), Ádám Kolozsi (Hungary), Filip Konopczyński (Poland), Oleksandr Kostryba (Ukraine), Oleksandr Kraiev (Ukraine), Adam Leszczyński (Poland), Paweł Marczewski (Poland), Michał Matlak (Poland), Asya Metodieva (Bulgaria), Adrian Mihaltianu (Romania), Eva Mihočková (Slovakia), Malina Mindrutescu (Romania), Marta Musidłowska (Poland), Mastura Lashkarbekova (Tajikistan/Poland), Iván László Nagy (Hungary), Marco Nemeth (Slovakia), Valeriia Novak (Ukraine), Vitaly Portnikov (Ukraine), Matej Šimalčík (Slovakia), Jiří Schneider (Czechia), Sandra Sirvydyte (Lithuania), Sigita Struberga (Latvia), Zsuzsanna Szabó (Hungary), Dorka Takacsy (Hungary), Bartosz Wieliński (Poland), Volodymyr Yermolenko (Ukraine), Marcin Zaborowski (Poland) and Edit Zgut-Przybylska (Hungary).
About the project
Visegrad Insight is the main Central European analysis and media platform, generating future policy directions for Europe and transatlantic partners. It was established in 2012 by the Res Publica Foundation.
Foresight on European Values and Democratic Security (FEVDS). This project engages CEE civil society leaders in a foresight-driven debate on future EU policy developments to protect European values and freedoms.
visegradinsight.eu/foresight-European-values
Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Commission. Neither the European Union nor the granting authority can be held responsible for them.
© Visegrad Insight, Res Publica Foundation
Galczynskiego 5, 00-032 Warszawa, Poland
contact@visegradinsight.eu
www.visegradinsight.eu