Carrying out a data protection impact assessment if necessary

07 June 2024

Creating a dataset for the training of an AI system can pose a high risk to people’s rights and freedoms. In such cases, a data protection impact assessment (DPIA) is mandatory. The CNIL explains how and in which cases it should be carried out.

This content is a courtesy translation of the original publication in French. In the event of any inconsistencies between the French version and this English translation, please note that the French version shall prevail.


The data protection impact assessment (DPIA) is an approach for mapping and assessing the risks that a personal data processing operation poses to individuals, and for establishing an action plan to reduce those risks to an acceptable level. This approach, supported by the tools provided by the CNIL, is particularly useful for controlling the risks associated with a processing operation before it is implemented, but also for monitoring them over time.

In particular, a DPIA makes it possible to carry out:

  • an identification and assessment of the risks for individuals whose data could be collected, by means of an analysis of their likelihood and severity;
     
  • an analysis of the measures enabling individuals to exercise their rights;
     
  • an assessment of people’s control over their data;
     
  • an assessment of the transparency of the data processing for individuals (consent, information, etc.).
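The identification and assessment of risks mentioned above typically combines a likelihood rating with a severity rating for each risk. As an illustration only (the 1–4 scale, labels, and thresholds below are assumptions for the sketch, not the CNIL's own methodology or tooling), such a scoring step could look like this:

```python
# Illustrative sketch of a likelihood x severity risk scoring, as often
# used in DPIA-style methodologies. The 1-4 scale and the thresholds
# below are assumptions chosen for the example, not CNIL-prescribed values.

LEVELS = {1: "negligible", 2: "limited", 3: "significant", 4: "maximum"}

def risk_level(likelihood: int, severity: int) -> str:
    """Map a (likelihood, severity) pair to a coarse risk level."""
    if likelihood not in LEVELS or severity not in LEVELS:
        raise ValueError("ratings must be between 1 and 4")
    score = likelihood * severity
    if score >= 9:
        return "high"      # risk-reduction measures needed before processing
    if score >= 4:
        return "moderate"  # measures recommended, to be tracked over time
    return "low"

# Example: a significant (3) likelihood combined with a significant (3)
# severity yields a high risk.
print(risk_level(3, 3))  # high
print(risk_level(1, 2))  # low
```

In a real DPIA, each identified risk (illegitimate access, unwanted modification, data disappearance, etc.) would be assessed this way both before and after planned measures are taken into account.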

The DPIA must be carried out before the processing is implemented and should be updated iteratively as the characteristics of the processing and the risk assessment evolve.


Carrying out a DPIA for the development of AI systems


AI risks to consider in a DPIA


Actions to be taken on the basis of the results of the DPIA