Criticism of platforms instead of a learning curve
One example comes from his observation of current political shifts, which find strong expression in digital media. “A very obvious but, in my view, wrong step is to say that it was the digital media,” says Jungherr. The assumption that social platforms and their algorithms merely need to be switched off so that people vote less for right- or left-wing populists and the established parties are rid of their problems is, he argues, too short-sighted.
Jungherr looks at the mechanisms behind this: at newer parties that train their supporters to be loud in online media, and at others that do not. “This is of course a difficult task for the established parties – to say: ‘How do we manage this cultural change in our organisations?’” Learning to be active online and to bring top candidates to the fore – candidates who are then also supported by the grassroots in the digital space – is far harder to implement than criticising technical mechanisms and American or Chinese companies.
Aiming for an overall view
Jungherr also aims for a comprehensive understanding when it comes to the use of digital tools, for example in election campaigns. There are studies that focus only on the tools deployed in political campaigns and analyse which digital tools are used and how. The problem with this approach is that the effects are overestimated: they are often merely assumed, and fears are articulated prematurely. “People then underestimate the braking effect that our psychology has. This is because it is embedded in broader information environments and our social environment,” says Jungherr.
On the other hand, there are studies that look only at the effects. The danger here is “that we underestimate the extent to which digital media change the practice of politics, for example”. This is because measurable effects are small, and studies have great difficulty capturing long-term effects empirically. “Above all, you can’t see how the practice of politics or the practice of the political public and news usage have actually changed,” says Jungherr. Even if the change cannot be translated into measurable effects, this does not mean that there is no change and everything continues as before. His aim is to incorporate these changes nonetheless.
For me, it is always important to try to take both perspectives into account in different publications and projects in order to provide a more balanced overall picture.
Prof. Dr. Andreas Jungherr
“Very poor elite discipline”
When it comes to the question of how some content can spread very widely on the internet, Jungherr likewise rejects hasty attributions. “I think it’s incredibly important that in the debate about the role of digital media today, we don’t just look at the companies and say: ‘That’s the algorithm’, but also recognise that a lot of the communicative problems we see with the dissemination of problematic content arise because we have very poor elite discipline in not disseminating this content.” Indeed, Jungherr argues that even if traditional media or politicians can no longer keep content out of the public sphere in a gatekeeper function, they remain important as hubs and amplifiers for wide reach – studies from the USA have shown this. They therefore help determine how visible content becomes.
We have political elites who deliberately amplify unreliable or false content when they believe it will benefit them – they are part of the problem.
Prof. Dr. Andreas Jungherr
Doubts about the large-scale threat posed by disinformation
At a time when the European Commission has declared war on disinformation in order to protect democratic systems, surveys show that fake news makes citizens very uncomfortable, and fact-checkers are working to set the record straight, Jungherr remains critical. “Just because disinformation exists doesn’t mean it has the dangerous effect attributed to it,” he says. He defines disinformation as dangerous when it convinces people en masse of things that are not only untrue but also run counter to their own perceived interests. “That means I have to convince someone to act against their own interests. And that’s very, very difficult.” If it were easy, for example, it would be less difficult to gain broad support for action against climate change. “I know of little empirical evidence to suggest that disinformation can have this convincing effect on the masses.”
In terms of sheer quantity, disinformation in election campaigns or political communication also takes a back seat to the much larger amount of correct information. There would need to be arguments as to why disinformation of all things, which is seen less frequently and less intensively, should have an enormous persuasive effect. In addition, many of the groups that consume unreliable sources already hold convictions pointing in that direction – “not ideal in terms of democratic theory”, but the content does not change their opinion.
Information about disinformation is also unsettling
The findings do not end there. Jungherr follows the topic through to its psychological effect and a politically serious consequence: mistrust in the system. If people see unreliable content in an information space, are warned about false reports and read about foreign influence, they lose trust in the system and in their own judgement.
We know from empirical studies that people who are made aware of the occurrence of disinformation are better at recognising it. Unfortunately, however, they are then also more likely to misjudge correct information as false. This means that we are creating an attitude of general mistrust.
Prof. Dr. Andreas Jungherr
If mistrust of the system increases and people withdraw from communication channels, then politicians or authorities will also lose opportunities to reach people in emergency situations, for example, warns Jungherr. This could potentially damage and destabilise society more than would be the case with fake news with limited reach and persuasive power.
bidt research projects on AI
The latest development in the digital transformation of political communication is the influence of artificial intelligence. Jungherr is researching this in two projects at the bidt – one on the use of generative AI in election campaigns and the other on the influence of authoritarian regimes on language models and their users.
The first project is “almost classic” in that it asks how AI is used by parties and political actors. “What interests me most is how parties and political actors actually learn about new technology and communication channels. How is this learning passed on within parties? How does an organisation deal with technological change?” says the bidt director.
The second project, led by Professor Florian Töpfl from Passau, is investigating whether it makes a difference if an AI model comes from an authoritarian regime. It seeks answers to questions such as: Does an authoritarian AI reproduce the regime’s narratives, values or propaganda? And if so, at what point does the influence arise? Could it be that the AI even generates correct answers, but that these are then changed or filtered by an intermediary moderating intervention?
In addition to his work at the bidt, Jungherr also leads interdisciplinary research projects at the University of Bamberg. One project, which is part of the EU’s Horizon Programme, is investigating how people react to AI-supported deliberations. One finding from a study at the beginning of the year: “We asked people about their willingness to participate when AI is used in deliberative formats. And unfortunately, we see that the willingness to participate decreases – even if it is only used moderately and merely ensures that users get the right dialogue partner.” One point that Jungherr emphasises: AI scepticism is drawing a new dividing line in the population. “And this is a different divide to the socio-demographic and socio-economic divide,” says the scientist.
Interdisciplinary with a specialism in digitalisation
Politics, political sociology, political psychology, communication sciences and big data – Jungherr’s professional background and positions at several universities in Germany and a visiting professorship in Zurich reflect all of these areas. His digital interest was already evident during his studies. He began experimenting with Twitter analyses at the end of the noughties during his master’s thesis. Back then, he also focussed on the topic of digital election campaigns, which is still represented in his work today. And what could happen next?
There is already something that he would like to pursue more intensively: “What I find very, very exciting is the question of why we as a society emphasise certain risks and ignore others,” says Jungherr in the bidt interview. Digital media, the digital economy, AI – all of these are discussed very negatively in Germany. “But at the same time, we lose sight of the fact that risks also arise when we close our minds to things.” Data-based business models, for example, never became big in Germany. “And that’s why we now have a subsequent problem with AI.” He is interested in understanding why societies prioritise certain risks, some of which are exaggerated, while ignoring others that threaten both prosperity and future prospects. A very big question, as he says himself.