Brussels – Elections under the EU’s magnifying glass: EU institutions are scrambling to shield democratic processes in the 27 member states from malicious foreign interference, especially from Russia. Under particular observation are the upcoming legislative elections in Germany and the rerun of the presidential election in Romania, following the historic cancellation of its first round last December.
From the EU perspective, the common thread linking these two otherwise distinct votes is the need to protect electoral integrity in the face of external threats. Put plainly: to prevent a foreign power (read: Moscow) from meddling. That was precisely the theme of a lengthy hearing held yesterday afternoon (Feb. 17) before the Special Committee on the European Democracy Shield (EUDS) at the European Parliament in Brussels. Representatives of three national agencies that monitor electoral processes and online disinformation, the EU executive, and the External Action Service (EEAS), the Union’s foreign-policy arm, appeared before MEPs.
The Romania Case
Background: in early December, the Romanian Constitutional Court took the unprecedented decision to annul the first round of the presidential election held a few days earlier over presumed Kremlin interference in the campaign, especially on TikTok. The vote had handed an overwhelming victory to the independent (ultranationalist and pro-Russian) candidate Călin Georgescu, who had an outsized presence and following on the Chinese-owned platform.

According to the judges in Bucharest, the vote was to be deemed invalid due to massive outside interference. Romanians will therefore vote again in the first round on May 4, with a possible runoff on the 18th. Meanwhile, now-former president Klaus Iohannis resigned on February 10 to avoid impeachment, further exacerbating the political-institutional crisis in the country. In mid-December, the European Commission opened an investigation (its third) into TikTok for alleged violations of the Digital Services Act (DSA), the pillar of EU law regulating social networks and online platforms.
Manipulation and Dark Funding
According to a recent report by VIGINUM (the French Service for Vigilance and Protection against Foreign Digital Interference, created in 2021), networks of external actors deployed two malicious strategies in the context of the Romanian presidential election.
On the one hand, TikTok’s content-recommendation algorithms were manipulated. Hundreds of thousands of interactions, both authentic and inauthentic (i.e., generated by fake accounts, the notorious bots, allegedly totaling over 300,000), artificially and in a coordinated manner “inflated” the relevance of certain topics and content, making them go viral and thus favoring Georgescu.
On the other hand, there was a widespread and opaque operation funding some 130 influencers (many with small-to-medium followings) who, knowingly or not, ended up amplifying the positions of the pro-Russian candidate. The lack of transparency about the origin of the funding and advertisements allowed the foreign network to operate while remaining virtually invisible, directly reaching a gigantic pool of voters.

Considering that TikTok has about 9 million users in Romania, out of a population of 19 million, and that a substantial portion of them use the platform as their primary source of information, it is clear how the manipulation of content on this social network may have played a crucial role in tilting the election in Georgescu’s favor.
Also speaking on the Romanian case were members of the Expert Forum (EFOR), a Bucharest-based think tank that, among other things, monitors the spread of online disinformation. In their presentation to the EUDS Committee, EFOR experts emphasized that manipulation on this scale requires substantial economic resources: an alarm bell should have rung, for example, when Georgescu claimed to have incurred no campaign expenses.
Another crucial element is the limits researchers face in accessing the data needed to analyze users’ interactions with platform content. In theory, this data should be public; in practice, it is difficult to obtain, given the lack of cooperation from the platforms’ owners.
The elections in Germany
This latter issue, incidentally, was also recently highlighted in the context of the upcoming elections for the German Bundestag. In early February, the Berlin Regional Court ordered X, the social network owned by Elon Musk, to make “immediately available” to researchers a set of data needed to assess the pre-election environment.
In Germany, the polls open on Sunday (Feb. 23), and the temperature of public debate has soared in recent weeks over the apparent normalization of the pro-Russian, xenophobic far-right Alternative für Deutschland (AfD). Not only has the party been repeatedly praised by Musk (casting heavy doubt on the reliability of political-content moderation on X), but its backing was also accepted in a parliamentary vote by the Christian Democratic Union (CDU/CSU), the mainstream party expected to form the next government.

A representative of BNetzA, the federal agency responsible for, among other things, monitoring online disinformation, reported on the prospects for the integrity of the electoral process in Germany. A recent stress test confirmed that Berlin should be able to hold the elections relatively safely, but here too the problem lies with the platforms.
While the DSA provides tools described as “quasi-surgical” for addressing the risks in question, companies still need to explain more transparently to national and EU authorities how their algorithms work, starting with recommendation systems (which to date remain black boxes) and political advertising (shedding more light on where the funding comes from and how it is used). Beyond that, they must ensure the timely removal of fake accounts and give researchers access to data “before, during, and after elections.”
The efforts of Brussels
The main obstacles to effectively countering online disinformation therefore remain the full enforcement of regulatory provisions and coordination between national bodies (including intelligence agencies and various governmental structures) and EU bodies, cooperation on which representatives of the EU executive and the EEAS have insisted for some time. The EEAS has set up internal structures (such as its StratCom division) to counter so-called FIMI operations (Foreign Information Manipulation and Interference).

The Commission reiterates that monitoring elections is a national competence. What can and does exist at the European level is a framework for cooperation and coordination among national networks (of which entities such as VIGINUM, EFOR, and BNetzA are part) to support preparation and risk management. Another crucial point is the speed with which platforms provide the data requested by national or EU researchers and agencies, and with which governments act during crises or emergencies of various kinds to safeguard the integrity of democratic processes.
An adequate response to FIMI actions is likely to become ever more crucial in the coming years. Malicious operations from abroad (such as the Doppelganger operation, with which Russia has been attempting to destabilize European states for years) will continue and, abetted by the rapid development of artificial-intelligence applications, are likely to increase in intensity.
English version by the Translation Service of Withub