The rights of users of public services in the face of algorithms and AI systems: points of vigilance and recommendations from the Defender of Rights

Date
13 Nov 2024

Considering the growing number of individual administrative decisions taken on the basis of results delivered by algorithms or AI systems, the Defender of Rights is concerned about the risks that this algorithmization of public services poses for users' rights. She presents several recommendations so that the guarantees provided for by law are fully implemented.

A massive algorithmization of public services and guarantees provided for by law

In her report "Algorithms, AI systems and public services: what rights for users?", the Defender of Rights, responsible for defending the rights and freedoms of users of public services, examines the effectiveness of two particularly important guarantees to ensure respect for these rights: human intervention in decision-making and control of systems, and the requirement for transparency with regard to users.

In order to automate, improve or accelerate certain procedures, the administration is increasingly using algorithms or AI systems. Individual administrative decisions can thus be partially automated - if human intervention is planned - or fully automated. As a counterweight, European law (the GDPR) and national legislation (the Data Protection Act and the Code of Relations between the Public and the Administration, CRPA) provide guarantees to ensure that the rights of users of public services are respected.

"Partially automated" decisions: human intervention as a guarantee of respect for user rights

When an administrative decision is said to be "partially automated", a public official must, in the decision-making process, take positive, concrete and significant action based on, or alongside, the result generated by the algorithm.

The Defender of Rights notes, particularly on the basis of the complaints she receives, that this intervention sometimes proves non-existent - as is the case for the Affelnet high school allocation procedure, or Parcoursup - and sometimes inconsistent or biased, when the people involved in individual decision-making tend to endorse the results produced by the system without questioning them.

Faced with these limitations, and their effects on the rights of users of public services, the Defender of Rights recommends issuing mandatory criteria and operating procedures to more precisely qualify the nature of the human intervention required.

The duty of transparency: meeting users' requirements for information and explanation

When it has made a decision based on algorithmic processing, a public service must provide a certain amount of information to the user concerned, but also to the public. This legal requirement for transparency, which stems from a constitutional principle, must make it possible to understand such decisions in order to debate them, or even effectively contest them.

The Defender of Rights notes that these information obligations, although essential, are sometimes poorly known or poorly respected, and makes several recommendations, including:

  • establish a right to an explanation of individual administrative decisions that are fully or partially automated;
  • introduce a penalty in the event of non-compliance with the obligation to publish online information relating to the systems used;
  • determine whether machine learning systems may be used as the basis for individual administrative decisions, in light of the CRPA's transparency obligations;
  • involve users of the public service at all levels;
  • support research in this area as well as associative and collective projects aimed at promoting understanding and public debate around these subjects.