In recent years, we have witnessed the dawn of a new era in pre-election disinformation and the manipulation of public discourse. Instead of relying on blatant and easily debunked hoaxes and falsehoods, both political and non-political actors have shifted towards more sophisticated manipulation techniques. They exploit divisive issues and the declining trust in state institutions, the media, and even the integrity of information itself. They are aided by the political, economic, and social crises stemming from Russia’s aggression against Ukraine, as well as by social media and emerging technologies, including artificial intelligence.
Manipulative tactics also appeared in the autumn elections in Slovakia and Poland. In the Czech Republic, we must prepare for similar developments and anticipate that disinformation actors will seek to influence the European Parliament elections scheduled for June 2024. Therefore, on 1 November 2023, AMO organized a public debate on the spread of disinformation narratives during pre-election campaigns.
Our guests were Magda Jakubowska, Vice President of the Polish think-tank Res Publica, Nikoleta Nemečkayová, analyst at the Association for International Affairs (AMO), and Kristína Šefčíková, Project Manager at the Prague Security Studies Institute (PSSI). The discussion was moderated by Rikard Jozwiak, editor of RFE/RL in Prague.
Magda Jakubowska explained the main themes of the Polish pre-election debate and their impact on the polarization of society. She particularly highlighted the topic of migration and the use of the state television TVP to spread pro-government narratives. Nikoleta Nemečkayová analyzed the differences and similarities with the Slovak pre-election developments, emphasizing the Russian influence in Slovakia. She stressed that a variety of domestic actors are also spreading pro-Russian disinformation in Slovakia, mainly politicians whose narratives are then adopted and amplified by various disinformation websites.
Kristína Šefčíková then discussed the role of social networks and the motives of various disinformers. She presented the concept of the “monetisation of fear”, which refers to the financial motivation of disinformation actors. Disinformation content has the potential to go viral, generating significant income for its creators through voluntary contributions from readers and viewers, as well as through advertisements on disinformation portals. Many disinformation actors are themselves in financial need and are motivated primarily by the prospect of making money. Their difficult life situations often create a connection with their audience, making it easier for them to tailor content to the audience’s priorities and interests.
The speakers highlighted instances where artificial intelligence was used to manipulate voters in Slovakia and Poland, citing AI-generated recordings that addressed the main issues of each election. The panelists stressed that the real aim of disinformation is to create chaos in the information space, undermining the public’s trust in information as such. The result is a diminished public interest in the world around them. Artificial intelligence plays a crucial role in this phenomenon, as it can generate content faster, more cheaply, and at enormous scale. Jakubowska also brought up the role of Facebook/Meta in detecting and removing such fake content. Before the Polish elections, such content was removed within two days. However, as she noted, “once something gets on the internet, it stays there forever.”
The debate was also open to questions from the audience, covering topics such as political advertising transparency, EU regulations, and their effectiveness, as well as the possible increase in the spread of disinformation and other hybrid threats after EU enlargement. Last but not least, the experts answered questions about the (non-)existence of data on the volume of disinformation spread in individual countries, as well as the (non-)existence of short-term solutions in the fight against the spread of disinformation.
Funded by the European Union. The views and opinions expressed represent the views and opinions of the authors and do not necessarily reflect the views and opinions of the European Union or the European Commission. Neither the European Union nor the European Commission can be held responsible.