Practical Privacy with Synthetic Data

In this post, we will implement a practical attack on synthetic data models, described in the paper Secret Sharer: Evaluating and Testing Unintended Memorization in Neural Networks by Nicholas Carlini et al. We will use this attack to see how well synthetic data models with various neural network and differential privacy parameter settings actually protect sensitive data and secrets in datasets, and there are some pretty surprising results.
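To give a flavor of the referenced attack, here is a minimal sketch (not the post's actual code) of the exposure metric the Secret Sharer paper uses: a canary secret is inserted into the training data, the trained model's perplexity on the canary is compared against a large space of candidate secrets, and a high exposure score indicates memorization. The function name and the toy perplexity values below are made up purely for illustration.

```python
import math
import random

def exposure(canary_perplexity, candidate_perplexities):
    """Exposure metric from the Secret Sharer paper:
    log2(candidate-space size) - log2(rank of the canary), where rank is
    the canary's position when all candidates are sorted by model
    perplexity (lower perplexity = more strongly memorized)."""
    rank = 1 + sum(1 for p in candidate_perplexities if p < canary_perplexity)
    space_size = len(candidate_perplexities) + 1  # candidates plus the canary
    return math.log2(space_size) - math.log2(rank)

# Toy illustration with hypothetical perplexities: the canary scores far
# lower perplexity than random candidates, suggesting it was memorized.
random.seed(0)
candidates = [random.uniform(50, 100) for _ in range(9999)]  # hypothetical
canary = 12.3  # hypothetical perplexity the model assigns the inserted secret
print(f"exposure = {exposure(canary, candidates):.2f} bits")
```

In this toy case the canary ranks first, so exposure equals log2 of the candidate-space size; the post itself explores how different model and differential privacy settings drive this number up or down.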

#adversarial, #privacy