Safeguarding Democracy Against Information Manipulation and Hybrid Threats

Policy brief on resilience against foreign and domestic democratic interference

30 July 2025

Information manipulation and interference by foreign and domestic actors are rising, with recent cases showing local enablers. The EU needs proactive, strategic tools to counter the growing threat.

EXECUTIVE SUMMARY

  • Democratic Information Manipulation and Interference (DIMI), encompassing both Foreign Information Manipulation and Interference (FIMI) and domestic actors, is a growing threat to Europe’s democratic stability.
  • Tactics have evolved from online disinformation to physical sabotage, affecting Romania, Germany, Poland and beyond.
  • AI-powered disinformation, fake accounts, paid influencers, deepfakes and government officials spreading conspiracy theories increase the sophistication and reach of these narratives. Recent examples: Romania’s illegal AI-based campaign, Hungary’s misuse of state media and Slovakia’s NGO law.
  • Physical subversion (e.g., sabotage in Germany, bribes, instigated protests) now complements online tactics.
  • The DSA offers enforcement tools but is underutilised; platforms like TikTok continue to pose major disinformation risks. Continued pressure on the EU is needed, given its hesitation over fines against X.
  • A shift from reactive to proactive, cross-institutional responses is urgently needed.
  • European Democracy Shield and new EU Parliament committee show evolving awareness.
  • Vulnerable groups, including women, Roma and other minorities, are heavily targeted and require focused resilience efforts.
  • Recommendations include: investing in AI to counter FIMI efficiently; reinforcing institutions by equipping state agencies with mandates, training and oversight to investigate and prosecute DIMI actors; expanding platform accountability through the DSA; and embedding civic education in defence planning.


Key facts

*Information manipulation and interference are increasingly conducted by domestic actors, alongside foreign ones. The EU’s FIMI definition remains too narrow for this reality.

*Romania’s 2024 vote was marred by AI manipulation and illegal TikTok campaigning.

*270+ vehicles in Germany were sabotaged in a plot tied to foreign intelligence.

*Slovakia, Hungary and Georgia face backlash over ‘foreign agent’ NGO laws.

*Disinformation now includes paid influencers, fake accounts and deepfakes across CEE.

*Physical subversion is rising: protests, bribes and targeted violence.

*DSA enforcement remains patchy despite strong regulatory tools.

Foresight

AI-driven disinformation and hybrid threats are escalating. Without a strong, coordinated EU response, not only the next European elections but also broader social cohesion and adherence to democratic values are at systemic risk.

INTRODUCTION

The Growing Threat of Disinformation and Interference

Information manipulation and interference have surged to unprecedented levels across Europe, driven not only by foreign actors but increasingly by domestic ones. These expanded origins and tactics now seriously destabilise political systems – as seen in Romania – and threaten fair competition in countries including Germany, Poland and Moldova. DIMI (Democratic Information Manipulation and Interference), encompassing both domestic and foreign actors (FIMI), poses a major threat to societal stability and, amid geopolitical uncertainty, to national and continental security.

FIMI/DIMI tactics have evolved from online disinformation to physical subversion, terrorism and sabotage. For example, Romanian social media manipulation coincided with direct foreign support for a presidential candidate. In Hungary, government-backed political advertising skews elections, while anti-EU narratives from members of Slovakia’s government deepen divisions.

As interference increasingly shifts from virtual to physical realms, a broader range of state and EU institutions must engage in coordinated countermeasures. Mere reaction is insufficient; proactive prevention is essential – especially in already polarised societies where extremist voices gain traction.

Digital platforms have fostered echo chambers that isolate and amplify anti-democratic views. Yet, platform responses remain inadequate. Meta, for instance, has cut back its fact-checking and election teams, and from October 2025 will ban political, electoral and social issue ads in the EU, citing legal uncertainty and the complexity of the new TTPA regulation.

Effectively countering these threats while upholding Western values and social cohesion becomes a matter of national security, and demands strong institutions, clear laws and rigorous enforcement. Reactive approaches have proven inadequate – countermeasures must evolve in sophistication online and strength offline. Protecting democracy and building resilience against interference are now vital security priorities.

FIMI/DIMI in Visegrad+

The broad scope of DIMI is evident across the Visegrad+ countries (Poland, Czechia, Romania, Slovakia and Hungary).

Pro-Russian, Anti-EU Narratives: External actors, particularly Russia, have pushed narratives undermining trust in Romania’s democratic institutions and alliances. These narratives often align with ultra-conservative, anti-liberal factions to destabilise politics. Similar messaging is widespread in Visegrad+, especially in Hungary and Slovakia, where senior officials propagate such views.

Legislative Pressure on NGOs: In July 2024, the European Commission warned Slovakia against adopting a law requiring NGOs receiving foreign funding to label themselves as ‘organisations with foreign support’, citing risks of stigmatising civil society and undermining freedoms of association and expression. The Slovak law came into effect on 1 June 2025: although the labelling requirement was ultimately removed, it still imposes additional administrative burdens on civil society. Similar, though more severe, legislation exists in Georgia.

Information Manipulation and Disinformation Campaigns: Hungary faces criticism for information control favouring certain political narratives. During election periods, Poland sees populist and divisive disinformation campaigns exploiting societal fractures to influence voters.

Illegal Electoral Campaigns: Social media campaigns using fake accounts and paid influencers to spread false narratives violate electoral finance rules. A notable example is Romania’s 2024 presidential election, where Calin Georgescu orchestrated an illegal campaign.

Physical Sabotage: While Visegrad+ countries have not recently experienced physical sabotage, neighbouring Germany faced coordinated sabotage in early 2025. Over 270 cars had their exhaust pipes blocked with construction foam, accompanied by ‘Be greener’ stickers linked to the German Economics Minister. Initially blamed on radical activists, investigations suggest foreign operatives orchestrated these acts.

Measures taken at the EU Level

Over the past decade, as external and internal malign interference has grown more sophisticated, the EU’s response to disinformation and hybrid threats has mostly been reactive, often lagging behind attackers.

European Commission

The EU began addressing information manipulation in 2015 with the East Stratcom Task Force, aimed at helping civic actors identify fake news. However, early efforts were limited by underfunding and lack of enforcement powers.

With the rise of blatant foreign interference, the EU expanded its approach to include FIMI. The 2022 Digital Services Act (DSA) introduced potential sanctions for online platforms failing to curb misinformation, giving the EU important leverage – but enforcement remains limited. TikTok, under multiple investigations, illustrates this gap, as no sanctions have yet been applied. Similarly, the European Commission has delayed its investigation into platform X for allegedly violating the DSA, likely awaiting progress in EU-US trade talks. The probe, which was expected before the summer break, has now been postponed. While the Commission insists enforcement is independent, the timing raises questions; under the DSA, X could face fines of up to 6% of global annual revenue if found in breach.

Beyond the DSA, EU initiatives focus on detection and information sharing through tools like the EEAS’s Rapid Alert System (RAS) and FIMI-ISAC. The Commission also publishes annual Rule of Law reports and launched the European Democracy Shield in 2024 to tackle threats to elections, media freedom, and disinformation.

European Council

The Council lacks a dedicated strategic communication unit but includes such functions within its Directorate-General for Communication and Information (DG COMM). Counter-disinformation efforts are mainly coordinated via the EEAS.

European Parliament

In December 2024, Parliament formed a 33-member committee with a 12-month mandate to address FIMI and DIMI, with a first report expected later in 2025. Earlier, the 2022 INGE committee called for sanctions against disinformation spreaders, a unified EU foreign interference strategy, and stronger media literacy. While the DSA partly responds to these calls, many recommendations remain unfulfilled.

Redefining malign interference

Since the adoption of the DSA and the two INGE reports, three new elements have begun to redefine the nature of information manipulation and interference. These developments are not yet – or not sufficiently – reflected in the EU’s current approach.

1. Domestic actors driving manipulation

Interference by domestic actors who manipulate information and act against democratic principles has significantly increased. Foreign influence is now often channelled through these actors, amplifying their actions and statements. Extremist or anti-democratic politicians – whether in opposition or in power – as well as mainstream journalists, influencers and celebrities, have joined the information manipulation space. They exploit their platforms and notoriety to spread falsehoods and incite destabilising action.

To respond effectively, EU action must recognise that both foreign and domestic actors are integral to today’s threat landscape.

2. AI-enhanced disinformation

Artificial intelligence has rapidly increased the reach, speed and perceived credibility of misinformation. A 2024 study showed that 86% of people worldwide had encountered fake news – and AI-generated content was often seen as more trustworthy than human-generated falsehoods.

AI-generated materials now flood social media. During election campaigns, they are used to create and spread propaganda. For instance, Hungary’s Jobbik Party has used AI-generated images to portray immigrants negatively, aiming to influence public opinion through emotionally charged content. In Romania, AI was deployed to fabricate videos of candidates in compromising situations or making false statements.

Detection tools remain behind the curve, and countermeasures have yet to catch up with the evolving threat.

3. From online to offline sabotage

Malign interference has moved beyond digital manipulation to include real-world subversion. Foreign-funded actors have paid citizens to oppose candidates or amplify targeted narratives, instigated protests and disruptive actions, and orchestrated violence or sabotage. These activities represent a shift to a more extreme and dangerous form of interference.

Such physical acts are more visible and fall more clearly under criminal law – making them easier to prosecute where political will and institutional capacity exist. At the same time, their spectacle further mobilises extremist supporters and deepens societal polarisation.

Lessons learned

1. Indifference to Domestic Interference Is No Longer Tenable

The impact of information manipulation and interference – both foreign and domestic – on the stability and integrity of a state can be immense, as Romania’s recent experience shows. Failing to act against fake accounts inciting unrest and allowing extremist demonstrations to escalate can quickly trigger an uncontrollable chain reaction.

2. State institutions are central to resilience

State institutions play a decisive role in countering DIMI. Proactive monitoring of suspicious online and offline activity is essential to identify threats, prevent escalation and hold malign actors accountable. Effective mitigation depends on interagency cooperation and real-time information-sharing. A coordinated European focus on DIMI ‘hotspots’ increases the impact of national-level countermeasures.

3. AI is a double-edged sword – and a strategic asset

Detection tools must evolve as fast as the threat landscape. Artificial intelligence can help identify malign operations and attribute them correctly – a crucial step for applying the right legal and policy tools. AI can also help differentiate between domestic and foreign sources of manipulation.

The private sector is advancing fast:

  • Facticity.AI, developed by Singapore’s AI Seer, verifies text and video claims with 92% accuracy. During a presidential debate, it checked 250 claims in real time.
  • Reality Defender flags deepfakes in live video calls, offering fraud prevention potential.
  • Graphika uses AI to map online communities and expose coordinated disinformation campaigns.

The EU’s Horizon Europe programme co-funds vera.ai, which provides journalists and human rights investigators with AI-powered tools for deepfake detection, image verification and network analysis.
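The coordination-mapping approach used by tools like Graphika can be illustrated with a minimal sketch: flagging clusters of distinct accounts that post near-identical messages within a short time window, one common signal of inauthentic coordinated behaviour. All account names, messages and thresholds below are hypothetical, and real systems combine many more signals.

```python
from collections import defaultdict

def find_coordinated_clusters(posts, window_seconds=300, min_accounts=3):
    """Flag groups of distinct accounts posting identical text within a
    short time window -- a simple coordinated-behaviour signal.

    posts: list of (account, timestamp_seconds, text) tuples.
    Returns a list of (normalised_text, sorted account list) clusters.
    """
    by_text = defaultdict(list)
    for account, ts, text in posts:
        # Normalise case and whitespace so trivial edits don't hide duplication.
        by_text[" ".join(text.lower().split())].append((ts, account))

    clusters = []
    for text, hits in by_text.items():
        hits.sort()
        # Slide over the time-ordered hits looking for a dense burst.
        for i in range(len(hits)):
            burst = {acc for ts, acc in hits
                     if 0 <= ts - hits[i][0] <= window_seconds}
            if len(burst) >= min_accounts:
                clusters.append((text, sorted(burst)))
                break
    return clusters

# Hypothetical sample: four accounts push the same line within minutes.
sample = [
    ("acct_a", 0,   "The election was RIGGED, share now!"),
    ("acct_b", 60,  "the election was rigged,  share now!"),
    ("acct_c", 120, "The election was rigged, share now!"),
    ("acct_d", 200, "The Election Was Rigged, Share Now!"),
    ("acct_e", 50,  "Lovely weather in Warsaw today."),
]
print(find_coordinated_clusters(sample))
```

The text-matching here is deliberately crude; production tools cluster on network structure, timing and content similarity rather than exact duplicates.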

4. Laws Must Keep Pace With New Threats

Legislation must be updated to address evolving DIMI tactics, balancing individual freedoms with the need to protect national and EU-wide security. While over-legislation should be avoided, better implementation of existing EU rules at the national level would strengthen the legal toolbox.

Key instruments already exist, including:

  • The Digital Services Act (DSA)
  • The European Media Freedom Act
  • The Code of Practice on Disinformation
  • The Audiovisual Media Services Directive

5. Civil Society Is Vital, But No Longer the Front Line

While CSOs remain important in identifying and reporting DIMI/FIMI and strengthening societal resilience, they are no longer the first line of defence. The increasing subversiveness of threats requires well-equipped state institutions to take the lead – with proper mandates, training and resources. However, cooperation with civil society and independent media is essential to reinforce democratic resilience and ensure political accountability.

6. Joint Action Is Key – Across All Levels

A coordinated response involving national institutions, civic actors and European institutions is essential. EU regulations empower national authorities to exert pressure on platforms, but only national actors can implement concrete legal and security responses. Their collaboration multiplies the impact of each intervention.

7. Protect the Most Vulnerable – or Risk Societal Collapse

Certain groups are more exposed to manipulation due to limited digital access, low media literacy, mistrust in institutions or existing marginalisation. Women – especially those in public life or from minority backgrounds – are increasingly targeted by gendered disinformation, which seeks to silence, intimidate and discourage their participation in democratic processes. This tactic not only amplifies existing inequalities but also weakens democratic resilience as a whole. These vulnerable groups also include:

  • Roma communities
  • Residents of authoritarian states
  • Elderly populations
  • Ethnic and linguistic minorities
  • Low-income urban populations
  • Teenagers and youth

Among these, the Roma community remains the most vulnerable. Their longstanding exclusion has fuelled anti-system sentiment, which malign actors readily exploit. As demonstrated in Romania’s recent elections – and confirmed by Roma for Democracy/Roma for Europe – boosting the resilience of these groups is critical to democratic stability.

There is no one-size-fits-all solution. Targeted, tailored actions are essential.

Recommendations:

  • Maximise the Use of AI. AI is a powerful tool for both malign actors and defenders against disinformation. The EU should shift towards leveraging AI for detection, monitoring and countering DIMI, ensuring that AI research is ethical and cross-sector collaboration is promoted. Initiatives like Google’s Jigsaw, and AI’s potential in information verification more broadly, should be embraced, with a focus on human oversight and algorithm refinement.
  • Strong Institutions, Adaptive Instruments and Rules. Countering DIMI requires robust institutions with the capacity to trace and sanction malign actors, even those using proxies. The EU should promote cross-border cooperation, support civil society and independent media and ensure the enforcement of regulations such as the Digital Services Act (DSA), while increasing political will and resources to protect democratic institutions.
  • Societal Resilience. To combat the polarisation and extremism amplified by DIMI, the EU must prioritise education, media literacy and critical thinking to rebuild public trust. Resilience relies on strong institutions and an active civil society. Member states should integrate digital education into broader security strategies, empowering CSOs and media as key defenders of democracy.
  • Strategic Communication. The EU must improve its strategic communication to counter disinformation effectively. Governments, CSOs and independent media should develop modern communication strategies and work cross-border to share information and respond jointly. The European Democracy Package must prioritise support for civil society and media in dismantling disinformation.
  • Regulate Malign Activity, Not Narratives. Rather than regulating specific disinformation narratives, the focus should be on curbing harmful activities such as bot armies and algorithmic manipulation. The EU should enhance AI regulation and ensure better transparency around algorithms by adapting the criminal code to address these activities.
  • Bounty System for Researchers. To incentivise research, a bounty system could reward researchers who submit findings used to support actions against disinformation. This system would encourage further research and provide tangible benefits for researchers contributing to policy efforts.
  • Shared Funding Scheme for NGOs and CSOs. A shared funding scheme could ease the financial burden on CSOs working on disinformation, covering costs such as IT, accounting and legal fees. This would allow these organisations to focus on their core missions and expand their impact.
  • Allow CSOs to Partner with Private Businesses. By adapting grant structures to allow CSOs to collaborate with private businesses, the EU could foster more integrated and diverse approaches to combating disinformation. These partnerships could support projects with multiple objectives, broadening the scope of intervention.
  • Empower EU Representatives in Member States. EU representatives should be granted new powers to act as local watchdogs for EU values. They could lead committees, alongside local CSOs, to determine when cases of disinformation should be pursued, leveraging local expertise in the enforcement process.
  • Early Warning System for Disinformation. An early warning system for disinformation, similar to weather models, should be developed across the EU. Building on existing systems, such as in Lithuania, this would improve threat awareness and enable proactive responses.
  • Enhanced Political Exchange. Breaking silos between the executive branch, CSOs, businesses, influencers and EU-level and national agencies is crucial for effective disinformation responses. Enhanced political exchange would foster collaboration and improve the EU’s overall strategy.
  • Social Responsibility of VLOPs. Very Large Online Platforms (VLOPs), like actors in the defence sector, have seen immense growth and profits, often amplified by their central role in the digital information ecosystem. With this influence comes responsibility. VLOPs should be required to contribute financially to civil society organisations working to counter disinformation and support democratic resilience. This could take the form of mandatory contributions or structured partnerships, ensuring they play an active role not only in compliance but in defence of democratic values.
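The early-warning recommendation above can be sketched as a simple anomaly alert over daily mention counts of a tracked narrative: a count that spikes well above the recent baseline triggers human review. The baseline window, threshold and data below are illustrative assumptions, not a description of any existing system such as Lithuania’s.

```python
from statistics import mean, stdev

def spike_alerts(daily_counts, baseline_days=7, z_threshold=3.0):
    """Return indices of days whose mention count spikes well above the
    rolling baseline -- a toy early-warning signal for narrative surges.
    """
    alerts = []
    for i in range(baseline_days, len(daily_counts)):
        window = daily_counts[i - baseline_days:i]
        mu, sigma = mean(window), stdev(window)
        # Guard against a near-flat baseline (tiny variance).
        sigma = max(sigma, 1.0)
        if (daily_counts[i] - mu) / sigma >= z_threshold:
            alerts.append(i)
    return alerts

# Hypothetical daily mentions of one narrative; the last day is a surge.
counts = [12, 15, 11, 14, 13, 12, 16, 14, 13, 95]
print(spike_alerts(counts))
```

A real system would track many narratives across languages and platforms and weight sources by reliability, but the core pattern – baseline, deviation, alert – is the same.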

Authors

Alina Inayeh

Contributors

Team:

Galan Dall, Magda Jakubowska, Staś Kaleta, Tomasz Kasprowicz, Anna Kuczyńska, Natalia Kurpiewska, Magdalena Przedmojska, Wojciech Przybylski, Albin Sybera, Luca Soltész and Simon Xiao.

Fellows:

Radu Albu-Comanescu (Romania), Merili Arjakas (Estonia), Alina Bârgăoanu (Romania), Bohdan Bernatskyi (Ukraine), Marysia Ciupka (Poland), Spasimir Domaradzki (Poland/Bulgaria), Martin Ehl (Czechia), Artur Nowak-Far (Poland), Jan Farfał (Poland), Oksana Forostyna (Ukraine), Philipp Fritz (Germany), Ognyan Georgiev (Bulgaria), Marzenna Guz-Vetter (Poland), Jarosław Gwizdak (Poland), Pavel Havlicek (Czechia), Alina Inayeh (Romania), Ruslanas Iržikevičius (Lithuania), Krzysztof Izdebski (Poland), Staś Kaleta (United Kingdom), Matej Kandrík (Slovakia), Christine Karelska (Ukraine), Aliaksei Kazharski (Belarus/Slovakia), Viktoryia Kolchyna (Belarus), Ádám Kolozsi (Hungary),  Filip Konopczyński (Poland), Oleksandr Kostryba (Ukraine), Oleksandr Kraiev (Ukraine),  Adam Leszczyński (Poland), Paweł Marczewski (Poland), Michał Matlak (Poland), Asya Metodieva (Bulgaria), Adrian Mihaltianu (Romania), Eva Mihočková (Slovakia), Malina Mindrutescu (Romania),  Marta Musidłowska (Poland), Mastura Lashkarbekova (Tajikistan/Poland), Iván László Nagy (Hungary), Marco Nemeth (Slovakia), Valeriia Novak (Ukraine), Vitaly Portnikov (Ukraine),  Matej Šimalčík (Slovakia), Jiří Schneider (Czechia), Sandra Sirvydyte (Lithuania), Sigita Struberga (Latvia), Zsuzsanna Szabó (Hungary), Dorka Takacsy (Hungary), Bartosz Wieliński (Poland), Volodymyr Yermolenko (Ukraine), Marcin Zaborowski (Poland) and Edit Zgut-Przybylska (Hungary).

About the project

Visegrad Insight is the main Central European analysis and media platform. Established in 2012 by the Res Publica Foundation, it generates future policy directions for Europe and transatlantic partners.

Foresight on European Values and Democratic Security (FEVDS). This project engages CEE civil society leaders in a foresight-driven debate on future EU policy developments to protect European values and freedoms.
visegradinsight.eu/foresight-European-values

Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Commission. Neither the European Union nor the granting authority can be held responsible for them.

© Visegrad Insight, Res Publica Foundation
Galczynskiego 5, 00-032 Warszawa, Poland
contact@visegradinsight.eu
www.visegradinsight.eu

Strategic Foresight by Visegrad Insight

In-house programme dedicated to analysing impactful trends, mapping out potential scenarios and generating weekly and monthly foresights.
