2023-2024 Edition - LLM Privacy Project

2023-2024 Edition Overview

The 2023-2024 edition of the LLM Privacy Project focused on differential privacy, disclosure risk metrics, and practical applications in AI-driven environments.

Project Summary

The project was divided into three key phases:

Phase 1 - Technical Exploration

We analyzed inconsistencies in how differential privacy and its parameters are defined. Reviewing 25 papers, we grouped the definitions into four themes: noise injection, bounding information, hiding individuals, and miscellaneous. We also identified fundamental challenges in defining and implementing differential privacy.
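To make the "noise injection" theme concrete, the following sketch (not part of the project's code) shows the standard Laplace mechanism for an epsilon-differentially-private count query; the function names and the example data are illustrative assumptions.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, epsilon: float) -> float:
    """Epsilon-DP count: a counting query has sensitivity 1, so adding
    Laplace noise with scale 1/epsilon satisfies epsilon-DP."""
    return len(records) + laplace_noise(1.0 / epsilon)

# Smaller epsilon -> larger noise scale -> stronger privacy, lower utility.
noisy_total = dp_count(["a", "b", "c"], epsilon=0.5)
```

The privacy/utility tension studied in Phase 2 is visible directly in the `1/epsilon` noise scale: tightening the privacy parameter inflates the error of every released statistic.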

Phase 2 - Experimental Insights

We conducted five experiments examining how differential privacy affects data privacy and utility. The experiments covered the interplay of pre-processing and post-processing techniques, integration with k-anonymity, data utility, statistical comparisons, and the feasibility of comparing differential privacy with k-anonymity.
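For readers unfamiliar with the k-anonymity model compared in these experiments, the following minimal checker (an illustrative sketch, not the project's experimental code; the record format and field names are assumptions) tests whether a dataset is k-anonymous with respect to a chosen set of quasi-identifiers.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """A dataset is k-anonymous if every combination of quasi-identifier
    values is shared by at least k records."""
    groups = Counter(
        tuple(record[q] for q in quasi_identifiers) for record in records
    )
    return all(count >= k for count in groups.values())

# Hypothetical records with generalized age and zip-code prefixes.
data = [
    {"age": "30-39", "zip": "100**", "condition": "flu"},
    {"age": "30-39", "zip": "100**", "condition": "cold"},
    {"age": "40-49", "zip": "100**", "condition": "flu"},
]
is_k_anonymous(data, ["age", "zip"], k=2)  # one group has only 1 record
```

Unlike differential privacy, which bounds what any query can reveal, k-anonymity is a syntactic property of a released table, which is part of why directly comparing the two frameworks proved non-trivial.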

Phase 3 - Policy Integration

This phase examined how differential privacy aligns with existing privacy laws such as PIPEDA and Bill C-27. We provided guidance on how differential privacy can fit within anonymization frameworks and recommended clearer definitions and best practices for policymakers.

Final Report

You can download the final report for the 2023-2024 edition here:

Applicant and Team

Principal Investigator: Rafal Kulik, PhD, Professor of Mathematics and Statistics at the University of Ottawa.

Co-investigator: Teresa Scassa, PhD, Canada Research Chair in Information Law and Policy.

Industry Support

Privacy Analytics provided advisory support and coordinated access to relevant data and computing infrastructure.

Key Team Members

Student Researchers