How does the spread of disinformation through AI-supported technologies affect our trust in politics and the media? What measures can we take together to combat disinformation? And how could AI itself be used to counter disinformation effectively? These questions were the focus of the “bidt Perspectives” event “Democracy under pressure? How AI and disinformation influence social trust”. The event, organised by the bidt in cooperation with the Bavarian State Ministry for Digital Affairs, took place on 6 November 2024 at the Bavarian Academy of Sciences and Humanities (BAdW). It was moderated by Vera Cornette, Head of Communications & Strategy at the Bavarian State Ministry for Digital Affairs, and Professor Dirk Heckmann, member of the bidt Board of Directors and Chair of Law and Security of Digitalisation at the Technical University of Munich.
“The issue of disinformation is very serious – and should be taken very seriously,” explained Professor Alexander Pretschner, Chairman of the bidt Board of Directors and Professor of Software & Systems Engineering at the Technical University of Munich, in his welcoming address. The aim of disinformation is long-term destabilisation, which works “in particular by destroying trust, namely trust in statements, authorities and systems”. However, given the human mind's capacity to scrutinise content and statements, Pretschner advocated an optimistic outlook.
We can do something if we all work together – politics, science, business and civil society – if we devise and combine technical, regulatory, educational and awareness-raising mechanisms.
Prof. Dr. Alexander Pretschner, Chairman of the bidt Board of Directors
Trust and the impact of disinformation were also addressed by Professor Lena Frischlich in her keynote speech on “Disinformation and digital democracy”. According to the Professor of Communication and Media Psychology at the University of Southern Denmark in Odense, misinformation can lead to misperceptions and thus “shake trust in democratic institutions and weaken the willingness to participate in democracy”. Not everyone is equally susceptible, because the effect of disinformation and conspiracy myths is highly complex. There are many starting points for countermeasures. Individual media literacy plays a major role, and so-called cognitive immunisation can also help, for example by exposing people to the techniques of disinformation campaigns through browser games. In addition, each of us can contextualise, comment on and refute online content. Frischlich emphasised:
But that also means that we as journalists, scientists and politicians have to be trustworthy. Trust is not something that can be forced, nor is there a right to trust. We have to earn and maintain trust.
Prof. Dr. Lena Frischlich, Digital Democracy Centre, University of Southern Denmark Odense
Tools against disinformation
The pitches on the topic “How can we identify disinformation and manipulated content?” by Gudrun Riedl, Head of Editorial at BR24 Digital, and Rebekka Weiß, LL.M., Head of Regulatory Policy and Senior Manager Government Affairs at Microsoft Berlin, provided practical insights. Riedl explained how the BR24 fact-checking team #Faktenfuchs works and noted that “cheap fakes”, i.e. fakes created with simple means and therefore easier to recognise, still dominate at the moment. However, the share of AI-generated, professionally produced disinformation is growing, and there is currently no tool that can reliably detect all AI-generated content. Weiß likewise made clear in her pitch that tools such as detection software or watermarks for labelling AI content are important building blocks in the fight against disinformation – but not a panacea. That makes it all the more important that we all learn how to deal with AI: AI literacy, she argued, is far more powerful than any single tool.
Do what makes sense for you, but educate yourself! We are all on a shared journey of lifelong learning with AI, and I can only recommend that everyone start tomorrow at the latest.
Rebekka Weiß, LL.M., Head of Regulatory Policy / Senior Manager Government Affairs Microsoft Berlin
Bavarian alliance declares war on disinformation
Information and communication are increasingly shifting to the digital space, explained State Minister of Digital Affairs Dr. Fabian Mehring, MdL, in conversation with Professor Heckmann. So far, however, we have not managed to “install the quality criteria of quality journalism – I would even say, the rules of our democracy – there at the same speed. Political profiteers are filling this gap.” This is where the Bavarian Alliance against Disinformation, launched by the Bavarian State Ministry for Digital Affairs and the Bavarian State Ministry of the Interior, for Sport and Integration and scientifically supported by the bidt, comes in with concrete measures.
It will take a sustained effort by society as a whole if we want to succeed in putting a stop to the political profiteers in the digital space, the enemies of our democracy.
State Minister of Digital Affairs Dr. Fabian Mehring, MdL
Checking sources is just as important in the digital space as in the analogue world, urged State Minister of Digital Affairs Dr. Fabian Mehring, MdL. These quality criteria apply to social media posts just as much as to quality journalism – and we need to make this clear to people.
A long-term strategy, AI and media expertise are needed
In the panel discussion, experts from politics, research, media and business discussed possible solutions. The panel included State Minister of Digital Affairs Dr. Fabian Mehring, MdL, Professor Andreas Jungherr, Chair of Political Science at the Otto Friedrich University of Bamberg, Andrea Martin, Head of the IBM Watson Center Munich, Karolin Schwarz, author, journalist and trainer, and Lea Thies, Head of the Günter Holland School of Journalism and member of the editorial team at Augsburger Allgemeine.
The panellists agreed that training on disinformation is also needed within institutions. Media literacy education is essential from primary school onwards in order to sensitise young people early enough and to equip them with the tools and skills needed for a long-term impact.
The fight against disinformation is not a sprint, it's a marathon. This means we need long-term, tried-and-tested strategies – and a lot still has to happen.
Karolin Schwarz, author, journalist and trainer
Professor Jungherr raised a critical point: when it comes to disinformation as a tool in a political context, we should not blame everything on disinformation, but must also listen to the signals from the population. Disinformation “tends to reach people who already have specific convictions.” The most recent election results in the USA have shown that it is not enough to expose someone as lying – the accusation of disinformation made little impression on voters in the US election campaign. Jungherr therefore urged that we, too, heed this warning shot and focus more on the specific concerns of the population.
Regulation or self-responsibility of platform operators?
There was also a discussion about whether platform operators should be held more accountable. The problem, argued Thies, is that we are fighting against algorithms that rank private content higher than quality-checked content. A debunking of fake news will never have the same reach as the fake news itself, said Thies, “because it is not emotionally charged, because it does not generate engagement.” She therefore called on society to increase the pressure on platform operators, especially in order to protect our children.
Mehring, however, doubted whether regulation is the right approach. He is firmly convinced that the only chance is “to find solutions together with the platform operators. Anything else will not work.” National court rulings, he argued, are difficult to enforce when companies can ignore them because of where they are based.
Martin also emphasised that companies have a responsibility towards society in general, “especially when it is such a large, globally active company like IBM. In this respect, we also have an interest in limiting disinformation as much as possible.”
An update of democracy can only succeed in a joint effort
The event concluded with a keynote speech entitled “Towards an AI democracy: a better policy for Germany?” by Juri Schnöller. His appeal: democracy must reinvent itself – with AI. If democracy puts AI at its service, AI could make a positive contribution to social coexistence – to more agile education, greater justice and less bureaucracy.
Following the official part of the event, guests were able to exchange ideas at the get-together. At the bidt stand, academic speakers provided insights into research topics such as “Regulating disinformation” and “Russian disinformation campaigns abroad”. The Fraunhofer Institute for Applied and Integrated Security (AISEC) offered the opportunity to interactively learn how to recognise audio deepfakes, while the XR HUB Bavaria demonstrated the hidden mechanisms of social media algorithms in a virtual reality environment.
Our conclusion: there is no patent remedy – what is needed is a long-term strategy. One's own intellect, that is, critical thinking and judgement, together with AI and media literacy, protects against disinformation. In short, combating disinformation and strengthening trust in democracy requires all of us – politics, science, business and civil society.