Call for papers (IT, Social Sciences, Design and Law) - Privacy Research Day, July 1st 2025, Paris

12 February 2025

For the fourth edition of the Privacy Research Day, the CNIL is inviting the international academic community to submit contributions in IT, social sciences, design and law on the theme of privacy and personal data protection.

The Privacy Research Day: a major academic conference on data protection

The Privacy Research Day is an opportunity to build bridges between researchers from all disciplines and regulators. During the conference, experts from different fields will present their work and discuss its implications for regulation - and vice versa.

This interdisciplinary day is aimed at a wide audience familiar with privacy and data protection. Our objective is to create an exchange between legal experts, IT specialists, designers and social science researchers.

This event will also enable participants to discover innovative research: new challenges, new vulnerabilities and new solutions will be presented.

Finally, the Privacy Research Day is an opportunity to create lasting partnerships between academia, the CNIL and other public bodies.

Several types of contributions can be submitted:

  • recently published research articles in international peer-reviewed journals or conferences;
  • for contributions in social sciences, results of surveys in progress;
  • research projects at an advanced stage or nearing completion.

Call for papers

Paper proposals will be evaluated by CNIL experts, on the basis of an assessment of the scientific quality of submissions, their relevance to CNIL’s work, and the need to cover a diverse range of topics representing a variety of approaches. Contributions may be submitted in English or French, but presentations on the day of the conference should ideally be in English.
 
Contributions - in English or French - can be submitted on this page:
 

CNIL-Inria award in Computer science

For the 9th year, the CNIL-Inria Prize will be awarded during Privacy Research Day, in recognition of an academic publication in computer science by a European laboratory. The call for applications, separate from the call for contributions to Privacy Research Day, will be open until February 21st, 2025.

► Launch of 9th edition of CNIL-Inria Privacy Award

CNIL-EHESS awards in Social sciences

For the first time, following their partnership agreement, the CNIL and the School for Advanced Studies in the Social Sciences (EHESS) will be awarding a prize for the best paper in social sciences submitted to Privacy Research Day; submissions will be open until March 31, 2025. You can apply for the prize directly via the Privacy Research Day submission form.

► Launch of the CNIL-EHESS Award in Social Science

Members of the jury:

Mehdi Arfaoui (CNIL/EHESS); Christelle Aubert-Hassouni (Paris School of Business); Céline Borelle (EHESS/SENSE-Orange Labs); Yann Bruna (Université Paris-Nanterre); Antoine Courmont (Université Gustave Eiffel); Gaël Depoorter (Avignon Université); Yann Ferguson (INRIA); Camille Girard-Chanudet (EHESS/CNAM); Francesca Musiani (CIS/CNRS); Cécile Méadel (Panthéon-Assas Université); Kevin Mellet (Sciences Po); Valérie Peugeot (Sciences Po); Julien Rossi (Université Paris 8); Camille Roth (EHESS/CNRS); Ido Sivan-Sevilla (University of Maryland); Luke Stark (FIMS/Western University).

Main themes

This call for papers offers the academic community an indicative, non-exhaustive list of themes. Proposals for papers on other themes related to privacy and personal data protection are welcome.

"AI through the prism of ..."

This theme will enable us to address AI from at least three angles:

  1. Personal data: This angle looks at the impact of AI on people's rights, both when people's personal data is processed and when it is not. Research in computer science in particular will help us to identify situations where personal data has been used to train an AI, as well as phenomena of personal data regurgitation revealed by new types of attacks. Research in law may help us qualify the legal status of certain AI-generated content (voice imitations, deepfakes, etc.).
  2. Attacks: This section looks at computer attacks enabled, enhanced or amplified by AI. In particular, we will be looking at new attacks involving deepfakes and voice imitations, to understand how users take them into account, and to identify practices for detecting these attacks. Research may help estimate the volume of data (written, vocal, messages, etc.) needed to carry out these attacks.
  3. Regulation: This angle will focus on the regulation of AI in general. Work in social sciences, design and law may contribute to a better understanding of the tools available to regulators to define a framework for the development of AI and to build it with the various publics concerned.

"Exercising rights"

This theme proposes to examine how users actually use their rights (withdrawal, access, portability, etc.). Work in the social sciences, law and design will help us understand the reasons people exercise their rights, and identify the most typical ways in which they do so. We will also look at ways of making these rights better known and more accessible to people. What concrete work is involved in exercising and accessing rights, from the point of view of both the user and the organizations that support them?

"Reidentification: from theory to practice"

More and more personal data sets are becoming accessible, notably via data brokers. This theme will look at the impact of these data sets, which are often described as anonymous. Research in computer science may help us to identify the data sets that need to be taken into account to assess concrete risks to the security of individuals and States, but also to understand the diversity of uses made of this data (OSINT, targeting, training, etc.). Research in law may look at the exercise of rights over this data, and the legal framework governing data brokers.

"Science at the service of regulation"

The aim of this theme is to focus on scientific research in all disciplines used by or for regulators. Our aim will be to gain a better understanding of the practical issues involved in using scientific resources (typologies, survey results, methods, technical tools, recommendations) for regulatory purposes. What academic contributions have been solicited in the context of litigation or regulatory decisions? What are the possible conditions for interaction between science and regulation?

"International enforcement: when European legislation reaches beyond European territory"

This theme looks at the obligations imposed by European legislation outside European territory. Research in law and the social sciences may help us identify the problems faced by authorities in controlling and sanctioning beyond their initial borders. What are the main solutions today? Are the powers of European data protection authorities adequate?

Exploratory themes

In addition to the above themes, this year's Privacy Research Day call encourages the academic community to submit contributions on exploratory themes, drawing attention to less studied issues.

"Neurotechnologies and neuro-data"

This theme will look at the legal, technical and social issues raised by technologies that collect data directly from a person's nervous system, notably through consumer devices (smartphones, headphones, and augmented reality headsets) used for commercial purposes, convenience, or in the workplace.

"Victim of a data breach, now what?"

This theme focuses on the strategies and experiences of individuals facing a data breach (identity theft, phishing, deepfake, etc.). From a computer science perspective, we will be asking: what rebound attacks are enabled by existing vulnerabilities? Research in social sciences and law may help us to understand how individuals perceive their situation and try to exercise their rights. Does the breach have a lasting impact on their practices (trauma, behavioural adaptation, coping strategies)? What role do third parties play in helping victims and the organizations responsible?

"Is my phone listening to me?!"

This theme examines users' representations of data collection mechanisms, whether these mechanisms are actual or presumed. From a social science point of view, we will be asking where the feeling of being listened to comes from, and what effects this feeling has on usage. Does this feeling vary according to the type of device? Do users implement strategies to circumvent data collection? With the help of computer science, we may look at the extent to which such listening actually takes place, and attempt to identify strategies for dealing with those risks. Research in design will help us understand how to represent these risks to users (scores, pictograms, observatories, comparison tools) to help them in their decisions.

 

► If you have any questions: [email protected]