PPML18: Privacy Preserving Machine Learning - NIPS 2018 Workshop
Website: https://ppml-workshop.github.io/ppml/
Submission link: https://easychair.org/conferences/?conf=ppml18
Submission deadline: October 16, 2018
Privacy Preserving Machine Learning -- NIPS 2018 Workshop
Montreal, December 8, 2018
We keep all information up to date on the official workshop homepage:
https://ppml-workshop.github.io/ppml/
This one-day workshop focuses on privacy-preserving techniques for training, inference, and disclosure in large-scale data analysis, in both distributed and centralized settings. We have observed increasing interest from the ML community in leveraging cryptographic techniques such as Multi-Party Computation (MPC) and Homomorphic Encryption (HE) for privacy-preserving training and inference, as well as Differential Privacy (DP) for disclosure. Simultaneously, the systems security and cryptography communities have proposed various secure frameworks for ML. We encourage both theory- and application-oriented submissions exploring a range of approaches.
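To make the disclosure setting concrete, here is a minimal sketch of the Laplace mechanism, the textbook instantiation of differential privacy mentioned above. All names and parameters are illustrative, not part of the call; noise is drawn as the difference of two exponentials, which follows a Laplace distribution.

```python
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release true_value plus Laplace(sensitivity/epsilon) noise.

    For a query with L1 sensitivity `sensitivity`, this satisfies
    epsilon-differential privacy. The difference of two iid
    Exponential(rate=1/scale) draws is Laplace(scale)-distributed.
    """
    scale = sensitivity / epsilon
    noise = rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)
    return true_value + noise

# Example: privately release a counting query (sensitivity 1).
rng = random.Random(0)
noisy_count = laplace_mechanism(100.0, sensitivity=1.0, epsilon=0.5, rng=rng)
```

Smaller epsilon means a stronger privacy guarantee but larger noise (scale = sensitivity/epsilon), which is exactly the privacy-utility trade-off listed in the topics below.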
Submission Guidelines
Submissions in the form of extended abstracts must be at most 4 pages long (not including references) and adhere to the NIPS format. We do accept submissions of work recently published or currently under review. Submissions should be anonymized. The workshop will not have formal proceedings, but authors of accepted abstracts can choose to have a link to arXiv or a PDF published on the workshop webpage.
- Submission url: https://easychair.org/conferences/?conf=ppml18
- Submission deadline: October 16, 2018 (11:59pm AoE; extended from October 8)
- Notification of acceptance: November 8, 2018 (extended from November 1)
List of Topics
- secure multi-party computation techniques for ML
- homomorphic encryption techniques for ML
- hardware-based approaches to privacy preserving ML
- centralized and decentralized protocols for learning on encrypted data
- differential privacy: theory, applications, and implementations
- statistical notions of privacy including relaxations of differential privacy
- empirical and theoretical comparisons between different notions of privacy
- trade-offs between privacy and utility