123 points by deeplearner 6 months ago | 9 comments
john_doe 6 months ago next
This is a great breakthrough! Pruning neural networks is a vital step towards efficient deep learning models.
alex_c 6 months ago next
I completely agree, john_doe! How did they decide which connections to prune while minimizing the impact on the model's performance?
sarah_h 6 months ago next
They used a technique called 'gradient signal preservation' to identify redundant connections. The method retains the connections carrying the most informative gradient signal, so performance is maintained after pruning.
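To sketch the intuition (this is my own minimal reconstruction, not the paper's code: `model`, `loss_fn`, and the batch are assumptions, and I'm using a simpler first-order saliency |w * dL/dw|; if this is the GraSP-style criterion, the published score actually involves a Hessian-gradient product):

```python
import torch

def saliency_prune(model, loss_fn, batch, sparsity=0.9):
    """Prune the connections whose removal least disturbs the gradient
    signal, approximated here by the first-order score |w * dL/dw|."""
    x, y = batch
    loss = loss_fn(model(x), y)
    params = [p for p in model.parameters() if p.requires_grad]
    grads = torch.autograd.grad(loss, params)
    # Rank all connections globally; keep the top (1 - sparsity) fraction.
    scores = torch.cat([(p * g).abs().flatten() for p, g in zip(params, grads)])
    k = max(1, int(sparsity * scores.numel()))
    threshold = scores.kthvalue(k).values
    masks = []
    with torch.no_grad():
        for p, g in zip(params, grads):
            mask = ((p * g).abs() > threshold).float()
            p.mul_(mask)        # zero out the pruned connections
            masks.append(mask)  # keep masks to re-apply after optimizer steps
    return masks
```

The masks have to be re-applied after every optimizer step, otherwise weight updates resurrect pruned connections.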
tom_m 6 months ago next
@sarah_h, my concern with this pruning technique is that it doesn't guarantee a fixed model size or traceability: the set of pruned connections appears to change from iteration to iteration.
jane_q 6 months ago prev next
The paper takes an interesting approach: the pruning is learned iteratively inside the training loop, which I assume is computationally expensive.
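If I had to guess at the structure, it's something like this (purely my sketch; the prune-every-N-epochs schedule and the global magnitude scoring are stand-ins for whatever criterion the paper actually uses):

```python
import torch

def magnitude_masks(params, sparsity):
    # Stand-in scoring step: global magnitude pruning to a target sparsity.
    scores = torch.cat([p.abs().flatten() for p in params])
    k = max(1, int(sparsity * scores.numel()))
    threshold = scores.kthvalue(k).values
    return [(p.abs() > threshold).float() for p in params]

def train_with_iterative_pruning(model, optimizer, loader, loss_fn,
                                 epochs=30, prune_every=5, step=0.2):
    masks = None
    for epoch in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            loss_fn(model(x), y).backward()
            optimizer.step()
            if masks is not None:
                with torch.no_grad():
                    for p, m in zip(model.parameters(), masks):
                        p.mul_(m)  # keep pruned weights at zero
        if (epoch + 1) % prune_every == 0:
            # Each round removes `step` of the surviving weights, so the
            # cumulative sparsity compounds toward the target.
            rounds = (epoch + 1) // prune_every
            target = 1 - (1 - step) ** rounds
            masks = magnitude_masks(list(model.parameters()), target)
            with torch.no_grad():
                for p, m in zip(model.parameters(), masks):
                    p.mul_(m)
```

The extra cost then comes from the repeated scoring passes plus the longer schedule needed to recover accuracy after each pruning round.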
spider_code 6 months ago prev next
Does anyone have a link to the paper or the code repo? I'm eager to check out the details.
hacker_xyz 6 months ago next
Here you go: [paper](https://arxiv.org/abs/xxxx) & [code repo](https://github.com/xxxx/xxxx). Hope it helps!
coder_0 6 months ago prev next
At first glance, it looks interesting. Do we have any early benchmarks or comparisons against existing pruning methods?
developer_12 6 months ago next
I've found early comparisons against L1/L2 magnitude pruning and the Lottery Ticket Hypothesis. The new technique seems to come out ahead in many cases.
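For context, one round of the Lottery Ticket baseline looks roughly like this (my paraphrase of the LTH recipe, not code from the paper or repo; `train_fn` is a placeholder, and real implementations usually prune per-layer and exclude biases):

```python
import copy
import torch

def lottery_ticket_round(model, train_fn, sparsity=0.2):
    # One round of the LTH recipe: train, prune by magnitude,
    # then rewind the surviving weights to their initial values.
    init_state = copy.deepcopy(model.state_dict())
    train_fn(model)  # train to convergence (placeholder)
    params = list(model.parameters())
    scores = torch.cat([p.abs().flatten() for p in params])
    threshold = scores.kthvalue(max(1, int(sparsity * scores.numel()))).values
    masks = [(p.abs() > threshold).float() for p in params]
    model.load_state_dict(init_state)  # rewind to initialization
    with torch.no_grad():
        for p, m in zip(params, masks):
            p.mul_(m)  # apply the winning-ticket mask
    return masks
```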