520 points by cryptoneural 7 months ago | 16 comments
user1 7 months ago next
This is quite an interesting development! Training neural networks on encrypted data could open up new possibilities for privacy-preserving ML applications. I'm excited to see where this research leads.
user3 7 months ago next
The problem with such techniques is that they tend to be computationally expensive and may degrade the network's performance. It would be interesting to understand the trade-offs made here.
user1 7 months ago next
Absolutely right, user3. Computational overhead is definitely a concern. However, hardware advances and algorithmic optimizations may reduce it over time. From the paper, their proposed method seems to have only a small impact on model accuracy.
user2 7 months ago prev next
I agree! The implications for fields ranging from healthcare to finance could be profound. We just need to make sure it's properly implemented without introducing new vulnerabilities.
user4 7 months ago prev next
It's a delicate balance between privacy and accuracy indeed. I'd be curious to know if the training data distribution impacts the results.
user5 7 months ago prev next
Great idea, questionable implementation. The work is promising, but I see several weaknesses. Hopefully they can strengthen it with additional defenses.
user6 7 months ago next
Heavy criticism, user5, but agreed. There's room for improvement. That said, the work is making waves, and that's what we need: more minds focused on improving it.
user7 7 months ago prev next
With end-to-end encryption, we could build privacy-respecting predictive models. That's exactly what we need now, with all the buzz around privacy violations.
user8 7 months ago prev next
There's a concern about sharing models trained on encrypted data: models can inadvertently memorize their training data, and an attacker can probe a model to tell whether a given record was in its training set (a membership inference attack).
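For anyone curious, here's a toy sketch of the simplest (loss-threshold) variant of membership inference. Everything here is made up for illustration: the "model" is a stand-in that returns near-zero loss on memorized training points and high loss otherwise, which is exactly the overfitting behavior the attack exploits.

```python
# Toy loss-threshold membership inference: an overfit model assigns much
# lower loss to training examples than to unseen ones, so an attacker can
# guess membership by thresholding the observed loss.
import random

random.seed(0)
train_set = {random.random() for _ in range(100)}  # hypothetical training records
test_set = {random.random() for _ in range(100)}   # records the model never saw

def model_loss(x):
    # Stand-in for querying a model that has memorized its training data:
    # near-zero loss on members, noticeably higher loss on non-members.
    return 0.01 if x in train_set else 0.5 + random.random()

THRESHOLD = 0.1

def is_member(x):
    # Attacker's rule: low loss => probably a training-set member.
    return model_loss(x) < THRESHOLD

# On this deliberately overfit toy, the attacker recovers membership perfectly.
correct = (sum(is_member(x) for x in train_set)
           + sum(not is_member(x) for x in test_set))
accuracy = correct / (len(train_set) + len(test_set))
```

Real attacks are noisier, of course, but the principle is the same: the gap between member and non-member loss distributions is the leak.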
user9 7 months ago prev next
For some niche use-cases in that field (medical records and privacy-preservation, for example), that might actually be just enough. Still, the overhead is worrisome. I wonder how it'd perform in real-world applications.
user10 7 months ago next
I would like to learn more about their experimental settings and hardware requirements as well.
user12 7 months ago next
Totally. That's the reality of research: there's a lot of work to do before this becomes practical. Early days yet!
user11 7 months ago prev next
Sounds like it needs much more research until it is practically usable. Really intriguing, though!
user13 7 months ago prev next
Researchers published a similar concept in 20XX. It'd be interesting to compare the two and see whether this work addresses the shortcomings of the original research.
user14 7 months ago prev next
Great point, user13! I've read the other work you're mentioning. Comparing techniques might shed light on improvements and non-obvious trade-offs. Thanks for bringing that up!
user15 7 months ago prev next
The homomorphic encryption technique they've used is a real game-changer. Looking forward to testing their models and seeing the performance.