Multidisciplinary Call for Papers (Computer Science, Social Sciences, Design, Economics and Law) - Privacy Research Day, June 4, 2024, Paris
As part of the third edition of the Privacy Research Day, the CNIL is inviting the international scientific community to submit academic publications and research projects in computer science, social sciences, design, economics and law on the theme of privacy and personal data protection.
Context
The Privacy Research Day is an interdisciplinary in-person event (social sciences, law, computer science, design and economics) aimed at a wide audience familiar with privacy and data protection. Its goals are to:
- stimulate interdisciplinary collaboration between legal experts, computer scientists, designers and social science researchers;
- build bridges between the world of regulation and the academic field, enabling scientifically sound regulation;
- promote research with a strong impact on data protection and regulation.
Several types of contributions can be submitted:
- For all disciplines:
- scientific papers recently published in international peer-reviewed conferences or journals;
- research projects at an advanced stage or nearing completion.
- For submissions in the social sciences, design and law: ongoing studies and projects can also be submitted.
To foster collaboration, participation is in-person only; however, the event will be live-streamed to allow remote attendance.
A non-public workshop will be organized the day after the conference.
Submission and evaluation
Proposals will be evaluated by CNIL experts based on the quality of submissions, their relevance to the work of Data Protection Authorities, and the variety of viewpoints and topics.
Important dates
- Submission deadline: March 8, 2024
- Notification to candidates: April 19, 2024
- Confirmation of presentation: April 26, 2024
- Submission of all documents: May 24, 2024
- Conference: June 4, 2024
Suggested topics (non-exhaustive)
The following topics will be given particular consideration during selection. Proposals covering other topics related to privacy and personal data protection will also be considered.
- Protection of vulnerable populations (minors, elderly people, people with disabilities, etc.)
Various types of people may be considered vulnerable (or at risk) because of their personal or professional situation.
- How do social sciences, design and law address this diversity and better protect these populations? What specific issues are associated with each type of vulnerability or vulnerable population?
- How can computer science research help turn technology and regulation to these populations' advantage? How can we detect practices that target only or mostly vulnerable populations?
Particular attention will be paid to work involving the protection of minors, elderly people, or audiences perceived as hard to reach or otherwise prevented from access.
- Interface, design patterns and the right to information
The impact of information and design is increasingly recognized in regulation, raising questions not only about the content of the information to be provided on complex or opaque data processing, but also about the way in which it is provided.
- How can work in computer science help identify the use of methods that limit people's choice and access to information?
- How can research in design, cognitive science and behavioral economics help inform users about best practices and steer them towards the most virtuous platforms?
- How can the results and evidence provided by these different disciplines be translated into law?
- Cybersecurity and privacy technologies
Cyber-attacks are increasingly aimed at accessing personal data.
- What risks and opportunities are created by generative AI, particularly through the automation of certain tasks and content creation? Do computer science studies enable us to identify the most effective measures for detecting and responding to this type of attack? What are the new attack vectors for stealing data? What data sources are used to enhance attacks on personal data?
- Do social science and design studies tell us anything about how users perceive these new risks, and how to raise their awareness?
- Effectiveness of regulation and comparative studies
Verifying the effectiveness of a policy is a classic problem for public authorities. It is all the more important in the case of data protection, as regulators' resources are scarce and citizens are demanding results.
- How can research in the social sciences (political science, sociology, economics) enable us to assess the effectiveness of a regulation, and based on what factors or indicators? How do the different aspects of regulation (support, awareness-raising, deterrence, economic interests) and the different actors (NGOs, government, platforms, intermediaries, etc.) fit into the compliance process? Data protection authorities themselves produce studies in behavioral economics, sociology and design: what lessons have been learned from them?
- What computer science tools can be used to monitor trends and measure impacts?
Comparative analyses across countries or legal frameworks, as well as studies tracking developments over time, will be particularly welcome.
- Artificial intelligence: doing more with less
Frugality seems to be a taboo in the field of artificial intelligence technologies.
- Can computer science research help us reconcile performance and data minimization, for example through data or dataset pruning? Can it also point to types of AI systems that should be promoted for more frugal data processing (embedded systems, edge computing, or alternatives to centralized statistical models worth exploring and putting into perspective)?
- AI explainability and transparency
AI algorithms are often complex and non-linear, making it difficult to understand their decisions and raising issues of trust, responsibility and ethics.
- How can social science research help us understand the roles played by the various actors involved in the design and operation of AI systems (designers, annotators, users)? How are responsibilities and values distributed?
- Can work in design and computer science help us better explain, describe and even audit AI systems?
- How can legal research support the improved exercise of people's rights (access, deletion) without hindering the development of these technologies?
- Technologies born of regulation
The GDPR and now the Digital Services Act (DSA) and the Digital Markets Act (DMA) are having a significant impact on the development of new services and technologies.
- Does computer science research enable us to identify which Privacy Enhancing Technologies (PETs), or any other technologies with an impact on data processing, enable better compliance with regulations? Conversely, what tools have been developed to circumvent regulations?
- How can social sciences help us understand why these technologies are (or are not) adopted? How can we explain the effects of technologies on the effectiveness of regulation, compliance and the responsibilities of a variety of actors? What sorts of strategies, practices or arguments emerge at such times? Are we observing a Brussels Effect harmonizing norms, or the emergence of local internets?
- Identifying, recognizing and controlling people: the risks of misusing protective technologies
The protection of minors and the fight against fraud are driving the deployment of age verification and content detection solutions. Some of these solutions could be misused for abusive purposes, such as mass surveillance or discrimination, or could fall under the control of attackers.
- How can research in law help to ensure that the use of these technologies is effective, responsible and in line with people's rights and expectations?
- Does computer science offer solutions to limit these risks?
- What do social science studies tell us about the issues involved in the increased deployment of these technologies? How can we better inform individuals and organizations about the associated risks?
- Interpersonal surveillance
The deployment of new technologies and services accessible to the general public (social networks, cameras, geolocation tags) is renewing the question of horizontal, rather than top-down, surveillance and control between peers, friends, family members, colleagues, spouses...
- How can social sciences help us understand these forms of surveillance, and the risks they pose (harassment, control, manipulation, violence)? What role do social norms play in the use, and even misuse, of these tools?
- How can research in law help define the role of public authorities and regulators in raising awareness of, monitoring and sanctioning these practices?
- Can tools developed in computer science limit the risks of interpersonal surveillance?
- The economic costs and benefits of GDPR compliance
Economic impact studies have so far documented the costs of compliance; however, little has been studied about the economic benefits that compliance can bring. Regulators need more insight into these benefits (their nature, quantification and distribution across the economy) to guide their policies and develop suitable compliance tools.
- How can research in the social sciences, in particular economics, help us understand the trade-offs between the opportunity cost of non-compliance and its potential economic benefits?
- What role does trust play in the digital economy, from both a micro and a macro perspective, for market development and innovation?
Contact
If you have any questions, please contact [email protected]