234 points by quantum_breakthrough 5 months ago | 14 comments
johnsmith 5 months ago
This is really interesting! I can't wait to see how this will impact the field of machine learning.
originalcommenter 5 months ago
Great point about implementation on mobile devices. Long overdue!
originalcommenter 5 months ago
Definitely. It has a lot of potential.
johnsmith 5 months ago
@originalcommenter I completely agree. The potential is what's most exciting!
machinelearner 5 months ago
Definitely excited for the future of neural networks with this compression technique. It could make implementation on mobile devices much more feasible.
anotheruser 5 months ago
Hopefully this will also help reduce training times.
neutraluser 5 months ago
We'll have to see how it performs in real-world conditions.
anotheruser 5 months ago
@neutraluser I believe there are already some models using this compression technique in production.
neutraluser 5 months ago
@anotheruser That's good to hear. I'll keep an eye out for it.
curiousdeveloper 5 months ago
I read that it reduces the size of networks by up to 90%. Is that true?
johnsmith 5 months ago
@curiousdeveloper I think that number might be a bit exaggerated, but it definitely reduces size significantly.
machinelearner 5 months ago
@curiousdeveloper It can vary depending on the network architecture, but yes, the reduction can be significant.
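The thread never names the actual compression method, so as a rough illustration of where figures like "up to 90%" can come from, here is a minimal sketch of magnitude-based weight pruning, one common compression approach. Everything in it (the layer size, the 90% sparsity target, the sparse index-value storage) is a hypothetical example, not the technique the article describes:

```python
# Hypothetical sketch: magnitude-based pruning of one weight vector.
# Not the method discussed in the article; numbers are illustrative.
import random

random.seed(0)
weights = [random.gauss(0.0, 1.0) for _ in range(1000)]  # a toy "layer"

sparsity = 0.90  # zero out the 90% of weights with smallest magnitude
threshold = sorted(abs(w) for w in weights)[int(sparsity * len(weights))]
pruned = [w if abs(w) >= threshold else 0.0 for w in weights]

# Stored sparsely as (index, value) pairs, the pruned layer keeps
# only ~10% of its entries, which is where "90% smaller" claims originate.
nonzero = [(i, w) for i, w in enumerate(pruned) if w != 0.0]
print(len(nonzero))  # 100 of 1000 weights kept
```

How much of that sparsity translates into real on-disk or in-memory savings depends on the storage format and architecture, which matches the point above that the reduction varies by network.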
someusername 5 months ago
Can't wait for the open source implementation!
anotheruser 5 months ago
Same here! There's already talk about a GitHub repo for this.