52 points by tanmaig 6 months ago | 13 comments
john123 6 months ago
Fascinating case study! I'm excited to see the new advances in neural network optimization. Great work!
sarah456 6 months ago
I wonder how this will impact the development of smaller neural networks. Keep up the good work!
metal_learner 6 months ago
It would be great to see a tutorial implementing the optimization technique from the study in TensorFlow or PyTorch.
bigmachines 6 months ago
The computational complexity should be considered when applying these methods to larger networks. Have the authors considered this?
john123 6 months ago
That's a fair concern. What did the authors do to keep the computational cost manageable in their experiments?
bigmachines 6 months ago
Judging from the discussion in other comments, the authors didn't mention mitigating computational complexity in specific terms. Anyone able to confirm or deny this?
ai_techer 6 months ago
Really enjoyed this research study. I believe these optimization techniques could be beneficial for my current project.
metal_learner 6 months ago
Would it be possible to optimize the learning rate during the training process using these techniques?
ai_techer 6 months ago
The authors do discuss dynamically adapting the learning rate during training. Check out page 20 of their study.
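For anyone who hasn't read the paper yet: since the study isn't linked in this thread, here's a rough plain-Python sketch of one generic dynamic scheme (reduce the learning rate when the loss plateaus), not necessarily the authors' exact rule. All names and constants are illustrative.

```python
# A generic "reduce on plateau" learning-rate schedule on a toy
# 1-D quadratic loss. Everything here is illustrative, not from the paper.

def train(steps=200, lr=0.4, patience=5, factor=0.5):
    w = 10.0                       # parameter to fit; the optimum is w = 3
    best_loss = float("inf")
    stall = 0
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)     # gradient of the loss (w - 3)^2
        w -= lr * grad             # plain gradient-descent step
        loss = (w - 3.0) ** 2
        if loss < best_loss - 1e-12:
            best_loss, stall = loss, 0
        else:
            stall += 1
            if stall >= patience:  # loss has plateaued: shrink the step size
                lr *= factor
                stall = 0
    return w, lr
```

In practice you wouldn't hand-roll this; both major frameworks ship reduce-on-plateau style schedulers out of the box.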
anna_facet 6 months ago
Any insight about the compatibility of these methods when integrating with TensorFlow or PyTorch?
sarah456 6 months ago
TensorFlow ships with built-in support for many optimizers. Would these methods be compatible with those existing optimizers?