

Discipline

Communication studies
Political science

Digital disinformation to influence political elections

Reading time: 6 min.

Digital disinformation to influence political elections refers to deliberately and demonstrably false digital news reports that are disseminated in the run-up to political elections in order to influence the outcome of the vote. While disinformation is a general phenomenon (see fake news), it poses a particular challenge in the run-up to elections. The outcome of democratic elections has an enormous influence on political decisions in the years that follow. At the same time, governments and their actions are usually under especially close scrutiny during election campaigns. In this context, disinformation can be particularly effective and thus deepen mistrust of politics. At the very least, there is great concern among the population that disinformation is able to influence elections (cf. Bernhard et al. 2024).

Digital disinformation in the context of political elections can be directed against politicians, parties or individual institutions (such as the government, the opposition, courts, the media or the European Union), against individual projects and laws, or against other public figures. The originators of such disinformation can be state as well as non-state actors, from Germany or abroad. Digital disinformation can quickly gain a wide reach through dissemination on social media or on supposedly reputable online news sites. In addition, AI-generated content such as synthetic images, videos and voice recordings, which has already been used several times in election campaigns, makes it easier to produce, personalise and emotionalise disinformation (cf. Garimella/Chauchard 2024; Châtelet 2024). In this way, targeted disinformation campaigns can weaken trust in specific individuals, but also trust in politics, the media and facts in general.

With regard to domestic actors, digital disinformation in the run-up to elections is spread in particular by parties or party supporters. In the case of foreign actors, the term FIMI (Foreign Information Manipulation and Interference) refers in particular to influence exerted by Russia and China. According to the authorities, Russian disinformation campaigns in particular have repeatedly attempted to exert a pro-Russian influence on elections in Europe (cf. European External Action Service 2024; Bundesministerium des Innern und für Heimat 2024). However, it is often impossible to establish conclusively who exactly is behind digital disinformation aimed at influencing elections, as some actors successfully conceal their (financial and political) connections to foreign states. Increasingly, foreign actors also operate through domestic proxies, which makes identifying them even more difficult (cf. Allen 2024).

In the context of digital policy, there are various approaches to minimising the influence of disinformation on elections. At the European level, the Digital Services Act (DSA), adopted in 2022, requires operators of social media platforms to take stronger action against illegal and harmful content. In response, operators have developed different strategies, such as debunking, labelling and deleting disinformation. However, the largest social media platforms took action against only 55 per cent of the content containing disinformation in the run-up to the 2024 European elections (cf. Fundación Maldita 2024). In addition, raising awareness of disinformation (prebunking) and promoting media literacy are further strategies for reducing the influence of disinformation on elections.

Comparability with analogue phenomena

Disinformation also exists in the analogue world and has been used to influence elections. For example, newspaper articles or election posters can be deliberately falsified in order to spread false information and change the outcome of elections at home or abroad (cf. Schinkels 2017). However, the possibilities for spreading disinformation in the analogue sphere are far more limited than in the digital sphere, which is characterised above all by simplified obfuscation and ubiquitous availability.

On social media, it is easy to create a supposedly reputable profile without revealing one's own identity (simplified obfuscation). This makes it difficult for users to determine whether a source of information is reputable or spreading disinformation. At the same time, content can be accessed from almost anywhere (ubiquitous availability), meaning that foreign actors can reach domestic users quickly and easily. In addition to these two main differences, the digital realm is also characterised by the increased modifiability and lossless duplication of content. On social media, specific social groups can be reached with customised content that can easily be adapted for each individual group, and individual items of content can be distributed thousands or even millions of times at virtually no cost. Overall, it is therefore much easier to spread disinformation to influence elections in the digital sphere than in the analogue one.

Social relevance

However, despite some spectacular revelations (see SGDSN 2024), it is still questionable to what extent there is a direct link between the spread of disinformation and a change in voting behaviour. This is because digital disinformation initially creates a climate of uncertainty that makes it less likely that the information received will be believed. Any digital information is therefore initially viewed with scepticism, without people immediately changing their political opinion and voting behaviour (cf. Allcott/Gentzkow 2017; Budak et al. 2024).

Nevertheless, digital disinformation is highly relevant to society when it comes to influencing political elections. Even if people do not immediately change their opinion as a result of disinformation, such disinformation can contribute to the polarisation of a society by reinforcing existing differences of opinion within society (cf. Chen et al. 2021). At the same time, digital disinformation can undermine trust in democratic institutions and elections if, for example, legitimate elections are called into question by false information (cf. Mauk/Grömping 2024). Overall, it can therefore be said that digital disinformation poses a particularly great challenge to democracy in the context of elections.

Sources

  • Allcott, H./Gentzkow, M. (2017). Social Media and Fake News in the 2016 Election. In: Journal of Economic Perspectives 31(2), 211–236.
  • Allen, D. (2024). Anticipating the Storm: Mapping Digital Threats to the 2024 European Parliament Elections. Democracy Reporting International.
  • Bernhard, L. et al. (2024). Verunsicherte Öffentlichkeit. Superwahljahr 2024: Sorgen in Deutschland und den USA wegen Desinformationen. Bertelsmann Stiftung, 44–50.
  • Budak, C. et al. (2024). Misunderstanding the harms of online misinformation. In: Nature 630, 45–53.
  • Bundesministerium des Innern und für Heimat (2024). Verfassungsschutzbericht 2023. 313–323.
  • Châtelet, V. (2024). Far-right parties employed generative AI ahead of European Parliament elections. Digital Forensic Research Lab.
  • Chen, E. et al. (2021). COVID-19 misinformation and the 2020 U.S. presidential election. In: Harvard Kennedy School (HKS) Misinformation Review.
  • European External Action Service (2024). 2nd EEAS Report on Foreign Information Manipulation and Interference Threats. 5–6. https://www.eeas.europa.eu/sites/default/files/documents/2024/EEAS-2nd-Report%20on%20FIMI%20Threats-January-2024_0.pdf [11.07.2024]
  • Fundación Maldita (2024). Platform Response to Disinformation during the EU Election 2024.
  • Garimella, K./Chauchard, S. (2024). How prevalent is AI misinformation? What our studies in India show so far. In: Nature 630, 32–34.
  • Mauk, M./Grömping, M. (2024). Online Disinformation Predicts Inaccurate Beliefs About Election Fairness Among Both Winners and Losers. In: Comparative Political Studies 57(6), 965–998.
  • Schinkels, P. (2017). Der Troll im Wahlplakat. Correctiv Hintergrund.
  • SGDSN (Secrétariat général de la défense et de la sécurité nationale) (2024). PORTAL KOMBAT: A structured and coordinated pro-Russian propaganda network. Technical Report.