123 points by dnndiffpriv 6 months ago | 14 comments
quantum_caper 6 months ago
This is really interesting. I wonder how this affects model performance. Has anyone done any comparisons?
bigbangrust 6 months ago
Yes, there have been some studies. I can share a link if you'd like.
codedreams 6 months ago
Thanks for sharing; I'm excited to read it. Another question: how does this scale? I'm working with massive datasets.
infinitodd 6 months ago
We've been experimenting with distributed systems for the heavy-duty training. We've had some success with the right infrastructure, but it's still slower than non-private training.
originalauthor 6 months ago
We have done some comparisons, and while there is a performance cost, it seemed worth it given the additional privacy. I'm happy to explain more in the discussion thread.
turingtale 6 months ago
Differential privacy usually incurs computational overhead, which can make it harder to scale to massive datasets. But there are techniques to optimize it.
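To make that overhead concrete, here is a minimal from-scratch sketch of a DP-SGD step on a toy linear model, using NumPy. All names and hyperparameter values below are illustrative, not any particular library's API. The main cost relative to plain SGD is that each example's gradient must be computed and clipped individually before averaging and adding noise.

```python
import numpy as np

def dp_sgd_step(w, X, y, l2_norm_clip=1.0, noise_multiplier=1.1, lr=0.1, rng=None):
    """One DP-SGD step for a toy linear regression model (illustrative only)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    # Per-example gradients: this is the main overhead vs. plain SGD,
    # which only ever needs the averaged gradient.
    residuals = X @ w - y                       # shape (n,)
    grads = residuals[:, None] * X              # shape (n, d): one gradient per example
    # Clip each example's gradient to L2 norm <= l2_norm_clip.
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    grads = grads * np.minimum(1.0, l2_norm_clip / np.maximum(norms, 1e-12))
    # Average the clipped gradients and add Gaussian noise scaled to the clip norm.
    noise = rng.normal(0.0, noise_multiplier * l2_norm_clip, size=w.shape)
    noisy_grad = (grads.sum(axis=0) + noise) / len(X)
    return w - lr * noisy_grad

# Toy usage: recover a linear model from noiseless synthetic data.
rng = np.random.default_rng(42)
X = rng.normal(size=(32, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w = np.zeros(3)
for _ in range(200):
    w = dp_sgd_step(w, X, y, rng=rng)
```

Despite the injected noise, the clipped-and-noised updates still drive the loss down; the privacy cost is tracked separately by an accountant, as a function of the noise multiplier and the number of steps.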
skynetrising 6 months ago
Absolutely. How well this method scales depends on the complexity of your model and your compute resources.
ghostinshell 6 months ago
Neural network training is always compute-intensive, and adding differential privacy can certainly impact scalability. We have to consider multiple factors, including dataset size, privacy budget, and compute power.
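As a toy illustration of the privacy-budget factor, under basic (linear) composition the per-step privacy losses simply add up, which caps how many training steps a fixed budget affords. Real accountants (e.g. the moments accountant used in DP-SGD analyses) give much tighter bounds; the helper names and numbers below are made up for illustration.

```python
def total_epsilon(n_steps, epsilon_per_step):
    """Basic composition: privacy losses add linearly across steps.
    (Advanced accountants give tighter, sublinear bounds.)"""
    return n_steps * epsilon_per_step

def fits_budget(n_steps, epsilon_per_step, budget):
    """Check whether a training run stays within a total epsilon budget."""
    return total_epsilon(n_steps, epsilon_per_step) <= budget

# e.g. spending epsilon = 0.5 per step against a total budget of 10:
print(total_epsilon(20, 0.5))        # 10.0
print(fits_budget(20, 0.5, 10.0))    # True: exactly at budget
print(fits_budget(21, 0.5, 10.0))    # False: one step too many
```

The practical upshot is the scalability tension mentioned above: more data or more steps means either a larger total epsilon (weaker privacy) or more noise per step (worse utility).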
machineprophet 6 months ago
I believe there are newer libraries, like TensorFlow Privacy and PySyft, aimed at training on distributed, encrypted datasets while keeping computations differentially private. Anyone have experience with these?
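This is not PySyft's actual API, but the core idea behind the encrypted-computation side can be sketched with plain additive secret sharing: each party holds a random-looking share of a value, no single share reveals anything, yet the parties can add shared values locally and reconstruct only the sum. Everything below is a from-scratch toy, not library code.

```python
import random

PRIME = 2**61 - 1  # modulus for the toy scheme; shares live in [0, PRIME)

def share(secret, n_parties, rng=random.Random(0)):
    """Split an integer into n additive shares that sum to it mod PRIME."""
    shares = [rng.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares mod PRIME."""
    return sum(shares) % PRIME

# Each of 3 parties adds its shares of two secrets locally;
# reconstructing the summed shares yields the sum of the secrets.
a_shares = share(42, 3)
b_shares = share(100, 3)
sum_shares = [(a + b) % PRIME for a, b in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 142
```

Note that secret sharing by itself only hides the inputs during computation; it provides no differential privacy guarantee on the reconstructed output, which is why these libraries combine it with DP noise.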
deeplearningfan 6 months ago
Yes, I've tried TensorFlow Privacy recently, and it's quite remarkable. However, there's a significant learning curve for certain use cases. It would be great to have tutorials on advanced topics.
algorhythmic 6 months ago
Agreed! I'd also like to see more real-world applications instead of just synthetic data. That would help the community learn more about the benefits and challenges.
dataphile 6 months ago
Excellent thread, everyone. The field is always advancing, and it's inspiring to see so many tools at our disposal for building more secure, privacy-preserving systems.
parallelpete 6 months ago
That's true. It's fascinating to witness this innovation in differential privacy and the impact it could have on deep learning.
aiqueen 6 months ago
I'd also like to point out that differential privacy is still under development and has its limitations. It's crucial to stay updated on the latest research before incorporating it into projects.