DataScience@BI seminar with Cangxiong Chen
Is training data private? Uncovering gradient leakage and deep learning with differential privacy.
- Starts: 12:00, 16 September 2025
- Ends: 13:00, 16 September 2025
- Location: BI - campus Oslo, B3 inner area
- Contact: Siri Johnsen (siri.johnsen@bi.no)
Abstract
Is training data kept private when only the model updates are shared during training? In general, the answer is no. Chen will present work investigating how this leakage of training data can happen. On the other hand, training with noisy gradients can protect against such leakage with theoretical guarantees, in terms of differential privacy. DP-SGD (differentially private stochastic gradient descent) is a popular method that uses noisy gradients to train neural networks with differential privacy. Chen will discuss its shortcomings, some ongoing work, and potential avenues for future work.
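For readers unfamiliar with the mechanism, the sketch below illustrates the core DP-SGD update from the abstract: clip each per-example gradient and add Gaussian noise before averaging. It is a minimal sketch on a toy logistic-regression problem, not the speaker's implementation; the synthetic data, clip norm `C`, noise multiplier `sigma`, and learning rate are illustrative assumptions, and the privacy accounting that turns `sigma` into a formal (epsilon, delta) guarantee is omitted.

```python
# Minimal DP-SGD sketch: per-example gradient clipping + Gaussian noise.
# Illustrative only; hyperparameters and data are assumptions, and the
# privacy accounting (e.g. a moments accountant) is not included.
import numpy as np

rng = np.random.default_rng(0)

def per_example_grads(w, X, y):
    """Gradient of the logistic loss for each example separately."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predicted probabilities
    return (p - y)[:, None] * X          # shape (batch_size, dim)

def dp_sgd_step(w, X, y, C=1.0, sigma=1.0, lr=0.1):
    g = per_example_grads(w, X, y)
    # Clip each per-example gradient to L2 norm at most C.
    norms = np.linalg.norm(g, axis=1, keepdims=True)
    g = g / np.maximum(1.0, norms / C)
    # Sum the clipped gradients, add noise scaled to the clip bound,
    # then average over the batch.
    noisy_sum = g.sum(axis=0) + rng.normal(0.0, sigma * C, size=w.shape)
    return w - lr * noisy_sum / len(X)

# Toy run on synthetic data.
X = rng.normal(size=(256, 5))
y = (X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) > 0).astype(float)
w = np.zeros(5)
for _ in range(200):
    batch = rng.choice(len(X), size=32, replace=False)
    w = dp_sgd_step(w, X[batch], y[batch])
```

Clipping bounds each individual example's influence on the update, which is what makes the added Gaussian noise sufficient for a differential-privacy guarantee; without the per-example clipping step, the noise scale could not be calibrated to any single record's contribution.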
Key research areas:
Mathematics of machine learning and how to build learning algorithms with privacy, fairness, robustness and generalisability.
The DataScience@BI seminar series is organised by the Department of Data Science and Analytics at BI Norwegian Business School.