
Artificial intelligence in court

The use of algorithms and artificial intelligence by the police to avert danger is controversial. The technology is attractive because it allows the police to analyse vast amounts of data that could never be evaluated manually. Yet this very capability leads to profound encroachments on fundamental rights. Now, for the first time, the Federal Constitutional Court has addressed the matter in detail.


The states of Hamburg and Hesse have created a legal basis for the automated analysis of police data files. The background is that the police have accumulated very large volumes of data that are difficult to evaluate manually in search of clues to future crimes. Hesse therefore uses software from the company Palantir, i.e. an algorithm that can automatically evaluate the police data holdings.

Police suspects are not only entered into the database to find clues to their possible future criminal behaviour; the software is also meant to identify criminal structures and networks. Hesse currently uses only an algorithm, but the legal basis is worded so broadly that artificial intelligence could also be deployed. Accordingly, the Federal Constitutional Court formulated its judicial standards on the premise that artificial intelligence would be used. In this respect, artificial intelligence came before Germany's highest court for the first time.

The use of automated data analysis, which may rely on artificial intelligence and which interferes with the right to informational self-determination, had to be measured against the principle of proportionality. The decisive factor was the specific weight of interference that the use of the technology entails.

The Federal Constitutional Court held that even the automated evaluation of data by means of an algorithm constitutes a serious interference, because it can generate new knowledge. The algorithm can already do what humans confronted with these huge volumes of data cannot. On the one hand, by identifying networks and structures in the mountains of data, it can cast suspicion on persons who were not previously suspects. On the other hand, it can assemble a comprehensive picture of a personality by linking the available data. If artificial intelligence is used, the weight of interference increases even further, according to the court. Pattern recognition would then make predictive policing possible, i.e. statements about the future dangerousness of particular persons generated by artificial intelligence. Added to this are the familiar problems of the limited traceability of artificial intelligence and the risk of discrimination.
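To make the court's concern more concrete, the following is a minimal, purely illustrative Python sketch of how merging separate data sets can surface indirect connections and thereby cast suspicion on previously unconnected persons. All data, field names and linkage logic are invented for illustration; they are not drawn from the ruling or from the software actually used in Hesse.

```python
# Illustrative sketch only: toy example of automated cross-referencing of
# separate (hypothetical) police data sets. The records and linkage logic
# are invented and do not describe any real system.
from collections import defaultdict
from itertools import combinations

# Hypothetical record sets that would normally sit in separate databases.
case_records = [
    {"case": "C-101", "persons": ["A", "B"]},
    {"case": "C-102", "persons": ["B", "C"]},
    {"case": "C-103", "persons": ["C", "D"]},
]
phone_contacts = [("A", "C"), ("D", "E")]

# Build an undirected "who is linked to whom" graph from both sources.
links = defaultdict(set)
for record in case_records:
    for p, q in combinations(record["persons"], 2):
        links[p].add(q)
        links[q].add(p)
for p, q in phone_contacts:
    links[p].add(q)
    links[q].add(p)

# Breadth-first search: every person reachable from a starting suspect.
def network_of(person: str) -> set[str]:
    seen, queue = {person}, [person]
    while queue:
        current = queue.pop(0)
        for neighbour in links[current]:
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen - {person}

# Person "E" appears in no case file, yet the merged data links them to
# suspect "A" via D and C -- the kind of newly generated knowledge the
# court treats as a serious interference.
print(network_of("A"))  # {'B', 'C', 'D', 'E'}
```

Even this toy example shows why the court sees automated linkage as qualitatively different from manual review: a person who never appears in any case file can nonetheless be drawn into a suspect's network purely through the combination of data sources.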

The question thus arose as to whether such a serious use of algorithms, or even artificial intelligence, for automated data evaluation in police work can be justified. The court made clear that the use of the technology can be reconciled with our constitution under narrow conditions. These are the narrow conditions the Federal Constitutional Court has developed in its settled case law on serious covert investigative measures: such measures are permissible only to protect particularly weighty legal interests, such as the life, limb and liberty of a person, the existence and security of the Federation or of a Land, essential infrastructure facilities, or other facilities of direct importance to the community. There must also be a sufficiently concrete danger to these particularly important legal interests: facts must exist that at least outline who is planning what and when in a way that endangers these interests before the technology may be used.

The ruling directly concerns only algorithms and artificial intelligence in police work. Its reasoning, however, reaches further and can be transferred to the use of such technology by state authorities in other areas. On the one hand, the specific weight of interference of algorithms, and especially of artificial intelligence, matters. Equally important, however, is that the use of the technology remains possible in principle: there need only be sufficiently weighty reasons capable of outweighing the respective severity of the encroachment under the principle of proportionality. The Federal Constitutional Court has thus said “yes, but” to artificial intelligence.

The blog posts published by bidt reflect the views of the authors; they do not reflect the position of the Institute as a whole.