Since the European Commission first proposed chat monitoring, there has been disagreement about its possible implementation and the consequences of scanning communications for the purpose of protecting children. Implementing this plan would raise technical, legal, and sociological issues, which are examined in this blog post.
The “Regulation of the European Parliament and of the Council laying down rules on preventing and combating the sexual abuse of children”, colloquially referred to as chat control, has still not been adopted after more than three years of negotiations. Its welcome goal of protecting children is to be pursued through various measures, such as removal orders for known material, but also – and this has received particular attention – through the scanning of interpersonal messages. Chat control in its original form from 20221 would have required communication providers to search their users’ data for abusive material involving children – both photos and text messages would have been affected.
Since the first legislative proposal on chat control, there have been various modifications, but ultimately none of them won the support of the majority of Member States. In light of the ongoing efforts to implement the proposal, it is worth taking a closer look at the difficulties of its possible implementation.
Whether the scanning of text messages and images is made mandatory, as initially planned, or only voluntary scanning is standardised, as in the current proposal, the question of technical implementation remains a challenge. The measures also face legal hurdles, especially as they are directed against all users without specific cause. In addition, such measures could prove unpopular with the general population.
Technical implementation
Particularly in the context of encrypted communication, it remains unclear how scanning communication content can be technically implemented. The fundamental problem is as simple to describe as it is difficult to solve: in order for encrypted content to be checked at all, the encryption would have to be removed or circumvented at least at one point. At that moment, however, the content would also be more vulnerable to external, non-governmental attacks.
It is possible to access the data “on device”, i.e. on the user’s end device, before it is encrypted and sent (so-called client-side scanning). Depending on the device, however, it is questionable whether this would even be technically feasible given the storage space such scanning requires.2 There is also a risk that scanning on the device could be evaded more easily.3 This raises the concern that the burden on all users would be disproportionately high, while the individuals actually sought by law enforcement agencies would be able to circumvent the scanning.
Various technical solutions were also considered in a leaked discussion paper by the European Commission4, without a suitable approach ultimately being identified: compared to the baseline of monitoring unencrypted communications, no method is equally effective, feasible and secure.
The subsequent problem arises from scanning the content once access to it has been gained. Scanning photos and text messages involves different requirements.
Scanning text messages
Scanning text messages was originally intended as a new way of detecting more complex abusive behaviour towards children – for example “grooming”, i.e. adults deliberately contacting minors with the intention of abusing them – and forwarding it directly to law enforcement agencies. The EU had planned to implement this using an AI application, which would probably have been unavoidable for such a task. However, the much greater context dependency of text messages compared to images poses a major challenge for AI models5.
According to estimates, an effective system for detecting abusive content in text messages should have a false positive rate of no more than about 0.001 per cent.6 This would affect around 100,000 messages out of the ten billion messages sent daily in the EU, which would then have to be reviewed by human staff.
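The scale implied by these figures can be checked with a simple back-of-the-envelope calculation. The sketch below is an illustration only; the message volume and error rate are the figures cited above, not parameters from the proposal itself:

```python
# Illustrative calculation: how many messages a scanning system with the
# cited false positive rate would falsely flag per day across the EU.

daily_messages = 10_000_000_000    # ~10 billion messages sent daily in the EU
false_positive_rate = 0.001 / 100  # 0.001 per cent, i.e. 1 in 100,000

flagged_per_day = daily_messages * false_positive_rate
print(f"{flagged_per_day:,.0f} falsely flagged messages per day")  # 100,000
```

Even at this best-case error rate, a six-figure number of harmless messages would land in front of human reviewers every single day.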
However, in order to even reach this point, such a system would first have to be developed, which the EU does not expect to happen: internally, it estimated a false positive rate of between 5 and 10 per cent7 for grooming detection in 2022. A recent report on voluntary scanning by providers in 2023 and 2024 points out that machine learning methods could, in the best case, achieve a correct classification rate of up to 92 per cent8. That would still leave 8 per cent of all cases misclassified. Even assuming that only chat sessions involving minors are scanned for grooming, a significant number of false positives can be expected. How this volume of reports, possibly numbering in the hundreds of millions, could be processed seems questionable. It would also mean that a large number of private, non-criminal chats could be read by moderators, and the awareness of this possibility could produce a chilling effect, a form of self-censorship caused by the knowledge of being under surveillance9.
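Why a seemingly high accuracy still produces overwhelmingly false alarms follows from the base-rate problem: grooming is rare relative to the mass of scanned conversations. The prevalence figure in the sketch below is a purely hypothetical assumption for illustration, not a number from the cited report; only the 92 per cent classification rate comes from the text above:

```python
# Base-rate illustration for grooming detection.
# ASSUMPTION (hypothetical, not from the cited report): 1 in 10,000 scanned
# chat sessions actually contains grooming. The 92% figure is the best-case
# correct classification rate mentioned above, i.e. 8% of cases misclassified.

prevalence = 1 / 10_000   # hypothetical share of sessions containing grooming
sensitivity = 0.92        # abusive sessions correctly flagged
specificity = 0.92        # harmless sessions correctly passed (8% flagged)

true_pos = prevalence * sensitivity
false_pos = (1 - prevalence) * (1 - specificity)

precision = true_pos / (true_pos + false_pos)
print(f"Share of flagged chats that actually involve grooming: {precision:.2%}")
```

Under these assumptions, roughly one in a thousand flagged chats would actually involve grooming; nearly every report would be a false alarm that a human moderator must read.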
Scanning photos
In the case of photos, the question arises as to whether the images should be compared with known representations of abusive material or whether the system should also be designed to recognise previously unknown images.
While the former is already carried out on a voluntary basis by social media companies and led to more than 19 million reports in 2024 alone10, the search for new abusive material remains very challenging given the current state of the technology.
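Matching against known material is conceptually straightforward: a fingerprint (hash) of each image is compared against a database of hashes of previously confirmed material. Real deployments such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-compression; the sketch below substitutes a plain cryptographic hash purely for illustration, which already fails if a single byte of the file changes:

```python
import hashlib

# Simplified sketch of hash-based matching against known material.
# NOTE: real systems use *perceptual* hashing (e.g. PhotoDNA); a cryptographic
# hash, as used here for illustration, matches only byte-identical files.

known_hashes = {
    # In practice: a database of hashes of previously confirmed material.
    hashlib.sha256(b"example known file contents").hexdigest(),
}

def is_known_material(file_bytes: bytes) -> bool:
    """Return True if the file's hash appears in the known-material database."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes

print(is_known_material(b"example known file contents"))  # True
print(is_known_material(b"the same image, re-encoded"))   # False: hash differs
```

The fragility of exact matching is precisely why detecting previously unknown or modified images requires statistical methods, with all the error rates that entails.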
How, for example, can a system automatically distinguish between a holiday photo of a child on the beach in a family group and abusive content? Among other things, artificial intelligence would have to be used here, but with the foreseeable consequence that the number of false positives would increase accordingly.11 Age assessment in particular, which is often difficult even for humans, carries a considerable risk of error here: if the age of teenagers is misjudged, for example, legal communication between young people can be wrongly classified as abusive.12
Legal classification
In addition to technical feasibility, the question arises as to how such a measure should be classified legally. According to the Federal Commissioner for Data Protection and Freedom of Information, comprehensive scanning of private messages would violate fundamental rights under the European Charter of Fundamental Rights13, the Basic Law and the principle of proportionality.14
Affected are the right to respect for private and family life (Art. 7 of the EU Charter of Fundamental Rights), the right to data protection (Art. 8 of the EU Charter of Fundamental Rights), the secrecy of telecommunications (Art. 10(1) of the Basic Law) and the fundamental right to the confidentiality and integrity of IT systems (as an expression of the general right of personality, Art. 2(1) in conjunction with Art. 1(1) of the Basic Law). There are also fears that scanning private messages would produce chilling effects on freedom of information and expression (Art. 11 of the EU Charter of Fundamental Rights).15
Chat monitoring is therefore considered by various voices in legal scholarship to be incompatible with fundamental rights.16
Acceptance
In addition to technical feasibility and legal classification, the social acceptance of state interference in citizens’ privacy is of central importance: especially in areas that touch on fundamental rights, the political legitimacy of state action depends to a considerable degree on whether the population supports it.
To our knowledge, there are no research findings on social acceptance of chat monitoring. However, there is research on the acceptance of other state measures for monitoring communication. This allows us to make informed assumptions about the social acceptance of chat monitoring.
Various studies show that general and unwarranted monitoring of communication behaviour directed at the entire population is much less socially acceptable than targeted measures directed at individual suspects.17 For example, representative surveys in Germany show that data retention, i.e. the general storage of telephone and internet connection data by telecommunications providers, is much more strongly rejected by society than the unnoticed online search of personal computers of suspects.18
It can therefore be assumed that indiscriminate and mass chat monitoring would also meet with little social acceptance. However, the extent of actual social resistance would also depend heavily on media coverage and public attention to the issue.
Outlook
Despite several attempts, the Council of the EU has repeatedly failed to reach agreement on chat monitoring. Now, the Committee of Permanent Representatives (a preparatory body of the Council) has agreed on a draft proposal. This means the proposal can be negotiated in the trilogue procedure (an informal process for reconciling legislative proposals between the Parliament, the Council and the Commission). Its outcome remains to be seen.
The Council Presidency’s proposal19 now under discussion differs significantly from the original version on the points that drew criticism. The most striking difference is that there is no obligation to scan automatically. However, scanning text messages and images remains possible on a voluntary basis under Regulation (EU) 2021/1232 and can count as a risk-mitigating measure. This gives providers an incentive to continue scanning, albeit voluntarily. The proposal also includes a clause mandating the Commission to examine the necessity and feasibility of a future detection order.
While the move away from mandatory scanning can be seen as a response to criticism, voluntary scanning is not without its problems.20
As of 18 December 2025
Literature
1 European Commission. Proposal for a Regulation of the European Parliament and of the Council laying down rules on the prevention and combating of child sexual abuse. 11 May 2022, https://eur-lex.europa.eu/legal-content/DE/ALL/?uri=CELEX:52022PC0209 [16 December 2025].
2 Bäcker, M./Buermeyer, U. My spy is always with me: Comments on the planned obligation of internet service providers to combat sexualised violence against children (“chat control”). VerfBlog, 11 August 2022, https://verfassungsblog.de/spion-bei-mir/ [16 December 2025], DOI: 10.17176/20220811-181838-0
3 Internet Society. Client-Side Scanning: What It Is and Why It Threatens Trustworthy, Private Communications. August 2022, https://www.internetsociety.org/wp-content/uploads/2020/03/2022-Client-Side-Scanning-Factsheet-EN.pdf [25 November 2025].
4 Leaked European Commission document, available at https://www.politico.eu/wp-content/uploads/2020/09/SKM_C45820090717470-1_new.pdf [25 November 2025].
5 FZI. FZI Position on Chat Control. Kalbfleisch, D./Rill, M./Vugrinic, A. (eds.), 27 September 2024, https://www.fzi.de/wp-content/uploads/2024/05/FZI_Position_on_Chat_Control.pdf [25 November 2025].
6 Anderson, R. Chat Control or Child Protection? 11 October 2022, DOI: 10.48550/arXiv.2210.08958
7 Meineck, S./Reuter, M./Meister, A. Leaked report: EU Commission accepts high error rates in chat control. netzpolitik.org, 29 June 2022, https://netzpolitik.org/2022/geleakter-bericht-eu-kommission-nimmt-hohe-fehlerquoten-bei-chatkontrolle-in-kauf/#2022-06-24_EU-Rat_RAG-Strafverfolgung_Chatkontrolle [25 November 2025].
8 European Commission. Report from the Commission to the European Parliament and the Council on the implementation of Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse. 27 November 2025, https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:52025DC0740 [12 January 2025]. Cited: Leiva-Bianchi, M./Castillo, N./Astudillo, C. A./Ahumada-Méndez, F. Effectiveness of machine learning methods in detecting grooming: a systematic meta-analytic review. In: Scientific Reports 15, Article number: 9008, 15 March 2025, https://doi.org/10.1038/s41598-024-83003-4
9 Eickstädt, E. Statement to the Committee on Digital Affairs on the CSA Regulation (“chat control”). 23 February 2023, https://www.bundestag.de/resource/blob/935528/Stellungnahme-Eickstaedt.pdf [16 December 2025].
10 Office of Justice Programs – U.S. Department of Justice. CY 2024 Report to the Committees on Appropriations: National Center for Missing &amp; Exploited Children (NCMEC) Transparency. 2025, https://www.missingkids.org/content/dam/missingkids/pdfs/cybertiplinedata2024/OJJDP-NCMEC-Transparency-CY-2024.pdf [25 November 2025].
11 Bäcker, M./Buermeyer, U. My spy is always with me: Comments on the planned obligation of internet service providers to combat sexualised violence against children (“chat control”). VerfBlog, 11 August 2022, https://verfassungsblog.de/spion-bei-mir/ [16 December 2025], DOI: 10.17176/20220811-181838-0
12 FZI. FZI Position on Chat Control. Kalbfleisch, D./Rill, M./Vugrinic, A. (eds.), 27 September 2024, https://www.fzi.de/wp-content/uploads/2024/05/FZI_Position_on_Chat_Control.pdf [25 November 2025].
13 The assessment regarding the violation of the Charter of Fundamental Rights is also shared by the EDPB: European Data Protection Board; European Data Protection Supervisor. Joint Opinion on the Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse. https://www.edpb.europa.eu/system/files/2022-07/edpb_edps_jointopinion_202204_csam_en_0.pdf [15 December 2025].
14 BfDI. The planned EU regulation on preventing and combating child sexual abuse – the so-called “chat control”. bfdi.bund.de, https://www.bfdi.bund.de/DE/Fachthemen/Inhalte/Telemedien/CSA_Verordnung.html [25 November 2025].
15 Tuchtfeld, E. Thank you very much, your mail is safe: How the European Commission wants to abolish digital privacy of correspondence. VerfBlog, 25 May 2022, https://verfassungsblog.de/vielen-dank-ihre-post-ist-unbedenklich/ [16 December 2025], DOI: 10.17176/20220525-182426-0
16 With further references: Bering, J./Windwehr, S. Digital Silver Bullets: Regulatory proposals that violate fundamental rights instead of effective protection for children and young people. VerfBlog, 30 August 2024, https://verfassungsblog.de/chat-kontrolle-effektiver-kinder-und-jugendschutz/ [16 December 2025], DOI: 10.59704/56dd28505f6a47b2; also Colneric, N. Legal opinion commissioned by MEP Patrick Breyer, The Greens/EFA Group in the European Parliament. March 2021, https://www.patrick-breyer.de/wp-content/uploads/2021/03/Legal-Opinion-Screening-for-child-pornography-2021-03-04.pdf [25 November 2025]; dissenting opinion, however, Schwarz, A. Data retention – The odyssey of child protection. In: Kriminalistik, 16 December 2022, 685.
17 Ziller, C./Helbling, M. Public support for state surveillance. In: European Journal of Political Research 60, 994. Antoine, L. Costs, Inconvenience, or Civil Rights? Investigating Determinants of Public Support for Surveillance. In: Surveillance & Society 21, 409. Jäger, F. Security vs. civil liberties: How citizens cope with threat, restriction, and ideology. In: Frontiers in Political Science 4, 1006711.
18 Lüdemann, C./Schlepper, C. The monitored citizen between apathy and protest – An empirical study on resistance to state control. In: Zurawski, N. (ed.). Surveillance practices. Practices of surveillance and control. (Leverkusen: Budrich UniPress, 2011), 119–138. Trüdinger, E.-M./Steckermeier, L. Trusting and controlling? Political trust, information and acceptance of surveillance policies: The case of Germany. In: Government Information Quarterly 34, 426.
19 Leaked text from the European Council Presidency, available at https://cdn.netzpolitik.org/wp-upload/2025/11/2025-11-13_Council_Presidency_COREPER_CSA-R_Partial-mandate_15318.pdf [16 December 2025].
20 Colneric, N. Legal opinion commissioned by MEP Patrick Breyer, The Greens/EFA Group in the European Parliament. March 2021, 34, https://www.patrick-breyer.de/wp-content/uploads/2021/03/Legal-Opinion-Screening-for-child-pornography-2021-03-04.pdf [25 November 2025]; various statements in Kulbatzki, J. Agreement on chat control. EU Council accepts Danish proposal. In: Tagesspiegel Background, 27 November 2025, https://background.tagesspiegel.de/digitalisierung-und-ki/briefing/eu-rat-nimmt-daenischen-vorschlag-an [16 December 2025].
The blog posts published by the bidt reflect the views of the authors; they do not reflect the position of the institute as a whole.




