Call for papers: Privacy Research Day, June 24, 2026, Paris

29 January 2026


The new edition of the Privacy Research Day will take place as part of the CNIL's 2026 hosting of the G7 Data Protection and Privacy Authorities. This call is open to all disciplines (Computer Science, Social Sciences, Law, Economics, Design, etc.).

The 5th Edition of the Privacy Research Day

This fifth edition of the Privacy Research Day will be organized in the context of the CNIL's 2026 hosting of the G7 Data Protection and Privacy Authorities Roundtable, an international forum gathering DPAs from the G7: France, the United States, the United Kingdom, Germany, Japan, Italy, and Canada, as well as the European Union.

Organized the day before the G7 DPAs Roundtable, the Privacy Research Day aims to build bridges between researchers from all disciplines around the world and delegates from the G7 Data Protection and Privacy Authorities (DPAs).

This interdisciplinary event is aimed at a broad audience familiar with privacy and personal data protection, and seeks to create a unique, international exchange between legal experts, computer scientists, economists, designers, social scientists, and regulators. Research involving multiple disciplines is particularly encouraged.

As in previous years, this edition will have a strong international focus, and we particularly encourage researchers from G7 members to submit their contributions.

Types of Contributions

Several types of contributions may be submitted:

  • Research papers recently published in international peer-reviewed conferences or journals;
  • For contributions in the social sciences: results from ongoing research studies, or research projects at an advanced stage or nearing completion.

Call for Papers

Proposals will be evaluated by CNIL experts based on the quality of submissions, their relevance to the work of the CNIL and the G7 DPAs, and the need to cover a diverse range of topics representing a variety of viewpoints. Papers will be reviewed in two steps:

  • the first step is based on the abstract, the publication venue or type of project, and the description of the paper;
  • the second selection stage involves grouping the selected proposals around common themes, with the goal of creating coherent panels and promoting interdisciplinarity.

More information on the selection process is available on the LINC website.

Selected papers will give their author(s) the opportunity to present their research before a wide audience, including the G7 DPAs, in Paris on June 24, 2026.

Contributions must be submitted via this page:

Submission form

Topics

This list is non-exhaustive, and submissions may concern any other topic relevant to privacy and data protection.

“Protection of Children and Vulnerable Users”

This topic explores specific privacy and security challenges faced by users whose personal, social, or professional situations create particular vulnerabilities. These include minors, seniors, and so-called excluded populations or marginalized communities. The diversity of these profiles calls for reflection on adapting protection mechanisms. The focus will be on emerging risks and uses, whether related to the deployment of artificial intelligence (mental health, deepfakes) or to new forms of control and protection (age verification, geolocation, parental control).

Key questions include:

  • What risks are most prevalent for different types of users? What concretely makes a user vulnerable?
  • What protective strategies are considered and/or applied by children/vulnerable users?
  • How can children/vulnerable users be protected while still granting them autonomy?
  • How can services be designed to adapt to different risk profiles?

“Privacy Without Borders”

The GDPR is said to have created a “Brussels Effect,” and we are witnessing a proliferation of data protection laws introducing new ideas. In the context of the G7, it would be particularly relevant to examine the cross-border effects of these texts: the “California Effect” with the CCPA/CPRA, the “Beijing Effect” with the PIPL, the concept of “Privacy by Design” originating in Ontario (Canada), and others.

This theme is more broadly open to work on international cooperation enabled or encouraged by these regulations, their effects on data transfers, and the role of Privacy Enhancing Technologies (PETs), which provide technical guarantees independently of regulatory frameworks and can therefore act internationally as compliance facilitators.

Key questions include:

  • What legal issues arise from the interaction of different national or international legislations?
  • What common denominators exist across different legislations?
  • What legal and technical tools are available for data exporters and importers?
  • Do PETs facilitate compliance with certain regulations?
  • How can users be informed of their rights under different legal frameworks?
  • How do privacy standards travel and adapt to regional specificities?

“Artificial Intelligence Through the Lens of…”

  • Data protection: AI systems require large volumes of (often personal) data for training and blur the line between data and AI models, from which personal data can often be extracted.
  • Security: AI systems can amplify the scale of known cyberattacks and enable entirely new ones (e.g., side-channel attacks that can decipher keystrokes via sound).
  • Regulation: Regulators and legal frameworks evolve and adapt to these challenges, particularly those associated with so-called “agentic” AI systems, including issues of liability and the potential use of such systems by regulators themselves.

Key questions include:

  • How do the interface and functioning of AI systems contribute to the processing of more personal data?
  • What new threats are emerging, and how does AI change security practices or the exercise of rights?
  • More generally, what new challenges do these developments pose for regulatory bodies?

“Wearables and IoT”

The integration of analytical capabilities and ubiquitous connectivity enables connected objects, whose size has significantly decreased, to continuously analyze their environment and collect data. The acceptability and use of these objects, particularly smart glasses, raise new privacy challenges. The theme also covers smart home devices, which are increasingly deployed in households in a similarly discreet manner.

Key questions include:

  • What are the privacy perceptions related to wearables and IoT?
  • How does the decreasing size/visibility of connected objects affect a user's ability to remain aware of data collection, and what mechanisms can work for screenless devices?
  • Do wearables and IoT raise new legal questions and use cases different from those of smartphones?

“Science-Based Regulation”

This theme aims to explore cases where scientific research is mobilized by or intended for regulators, and to observe situations in which regulators transform scientific work into “serviceable truths” (technical evidence used to ground legal decisions, enforce sovereignty, and shape European digital policy). The objective is to better understand the practical challenges of using scientific resources (typologies, survey results, methods, tools, recommendations) to support regulation.

Key questions include:

  • What conditions enable effective interaction between science and regulation?
  • How do regulators turn complex scientific data into “regulatory objectivity” and administrative proof?
  • Which academic contributions have been used in litigation or regulatory decisions and how?
  • What quantitative or qualitative methods are used to measure and sanction often “invisible” digital harms?

“Monetization of Personal Data and Business Models in the Digital Economy”

This theme welcomes contributions analyzing business models in the platform economy from a multidisciplinary perspective. Starting from the economic stakes of collecting and processing personal data and the associated risks, discussions will address the current and foreseeable evolution of these business models (e.g., consent-or-pay models), the role of technological developments, choice architectures, and corresponding regulations.

Key questions include:

  • How is the economic value of personal data perceived by platforms, and how does this compare to the value perceived by users?
  • Under what conditions does the “consent-or-pay” model respect the consent required by regulations like the GDPR? What is an appropriate price for such models?
  • To what extent do new regulatory frameworks (DMA, Data Act, etc.) rebalance the power dynamics between “gatekeepers” and users? Through which provisions and tools?

“Data Security: From Technical Tools to Social Practices”

This theme examines privacy protection as both a technical challenge and a matter of social adoption, analyzing how individuals and organizations equip themselves to face digital vulnerabilities.

  • Implementation of Privacy-Enhancing Technologies (PETs): this strand explores how techniques such as advanced encryption, differential privacy, and zero-knowledge proofs are now being integrated into consumer products and infrastructures. Contributions may analyze how these tools help meet GDPR requirements by natively reducing data collection, simplifying administrative obligations, enabling secure data flows, and in some instances enhancing competitiveness (see the illustrative sketch after this list).
  • Usable Privacy and Security: this strand focuses on the interface between tools and users, drawing on research in social sciences, design, and human-computer interaction. The challenge is to understand the often-complex transition from the existence of a technical solution (e.g., password managers or two-factor authentication) to its effective adoption. Research may examine risk perception mechanisms, including gaps between real threats and users’ expressed fears.
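
To make the first strand concrete, here is a minimal sketch, assuming nothing beyond the Python standard library, of one textbook PET: the Laplace mechanism for a differentially private count. The function names (laplace_noise, dp_count) and the opt-in scenario are hypothetical, chosen purely for illustration; real deployments rely on audited libraries rather than hand-rolled noise.

    import random

    def laplace_noise(scale: float) -> float:
        # Laplace(0, scale) sampled as the difference of two i.i.d.
        # exponential draws with rate 1/scale.
        rate = 1.0 / scale
        return random.expovariate(rate) - random.expovariate(rate)

    def dp_count(records, predicate, epsilon: float) -> float:
        # A counting query has sensitivity 1: adding or removing one
        # person changes the true count by at most 1, so Laplace noise
        # with scale 1/epsilon yields an epsilon-DP release.
        true_count = sum(1 for r in records if predicate(r))
        return true_count + laplace_noise(1.0 / epsilon)

    # Hypothetical usage: publish roughly how many users enabled a
    # privacy setting without exposing any individual's choice.
    users = [{"opted_in": random.random() < 0.3} for _ in range(10_000)]
    print(dp_count(users, lambda u: u["opted_in"], epsilon=0.5))

The single epsilon parameter makes the privacy/utility trade-off explicit: smaller values mean more noise and stronger guarantees, which is precisely the balance the first key question below asks about.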

Key questions include:

  • How do emerging PETs balance high-level data utility with robust security guarantees in commercial products?
  • To what extent can PETs “automate” GDPR requirements (like data minimization) and reduce the legal burden for organizations?
  • Why is there a persistent gap between the availability of PETs and their actual deployment?
  • How do users’ mental models and “expressed fears” differ from actual technical threats, and how does this affect their defensive strategies?

CNIL–INRIA and CNIL–EHESS Awards

This call is accompanied by calls for participation in the CNIL–INRIA Award (computer science) and the CNIL–EHESS Award (social sciences). Applications for these awards can be submitted via the Privacy Research Day submission platform.

During submission, applicants may also choose to submit their contribution for one of the awards, subject to eligibility criteria (publication less than three years old, in French or in English, within the relevant disciplines, and involving a laboratory located in the European Union for the CNIL–INRIA Award).

Unlike PRD contributions, which are selected by CNIL staff, the award laureates are chosen by independent juries. Selection criteria focus on overall scientific quality, originality and innovation, and the relevance of the topic for the CNIL and all stakeholders (from the general public to private and public actors) on data protection and privacy issues.

On the day of the Privacy Research Day, award winners will be invited on stage to receive their prize from representatives of the partner institutions. Winners may benefit from coverage of travel and accommodation expenses for the event.

Submission deadlines:

  • For the CNIL–EHESS Award: March 22
  • For the CNIL–INRIA Award: February 13

More information: privacyresearchday[@]cnil.fr