123 points by alex_cool_researcher 7 months ago | 15 comments
deepmathguru 7 months ago
Fascinating approach! This could change the way we think about backpropagation entirely.
ferventcoder 7 months ago
I agree; it really opens up new possibilities for optimization. Reducing the number of hyperparameters would be a game changer.
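To make that concrete: if you view training as integrating the gradient-flow ODE dtheta/dt = -grad L(theta), an adaptive solver picks its own step sizes from an error tolerance, so the learning rate stops being a hyperparameter. A toy sketch of the idea (my own illustration on a made-up quadratic loss, not anything from the paper):

    import numpy as np
    from scipy.integrate import solve_ivp

    # Made-up quadratic loss: L(theta) = 0.5 * ||A @ theta - b||^2
    rng = np.random.default_rng(0)
    A = rng.normal(size=(10, 3))
    b = rng.normal(size=10)

    def neg_grad(t, theta):
        # Right-hand side of the gradient-flow ODE: -grad L(theta)
        return -(A.T @ (A @ theta - b))

    # RK45 adapts its step size to meet rtol, so there is no
    # learning rate to tune, only an error tolerance.
    sol = solve_ivp(neg_grad, (0.0, 50.0), np.zeros(3), rtol=1e-6)
    theta = sol.y[:, -1]
    print("final loss:", 0.5 * np.sum((A @ theta - b) ** 2))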
codewonderer 7 months ago
Does this work in tandem with more commonly known algorithms, or does it replace them entirely?
deepmathguru 7 months ago
Great question! The approach should complement existing algorithms rather than replace them, further improving their performance.
codewonderer 7 months ago
What kind of differential equations are we talking about here? Stochastic, Ordinary or Partial?
deepmathguru 7 months ago
It's mostly ODEs, but the authors want to explore PDEs, with applications to RNNs, in future work. Keep an eye on it!
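A bit of intuition on why ODEs are the natural fit: plain gradient descent is exactly forward Euler applied to the gradient-flow ODE dtheta/dt = -grad L(theta), so better integrators slot straight in. Self-contained toy below (my illustration, not the paper's algorithm; same made-up quadratic as in the snippet above):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(10, 3))
    b = rng.normal(size=10)

    def grad(theta):
        # Gradient of L(theta) = 0.5 * ||A @ theta - b||^2
        return A.T @ (A @ theta - b)

    # Forward Euler on d(theta)/dt = -grad L(theta) with step h
    # is literally gradient descent with learning rate h; a
    # higher-order integrator gives a different optimizer.
    theta = np.zeros(3)
    h = 0.01
    for _ in range(2000):
        theta = theta - h * grad(theta)

    print("final loss:", 0.5 * np.sum((A @ theta - b) ** 2))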
quantumquantifier 7 months ago
For reproducibility, will the authors release the relevant code and data upon acceptance?
ferventcoder 7 months ago
Indeed, and I'm sure they'll see the benefits of open-sourcing the framework.
datapioneer 7 months ago
Someone should build an easy-to-use package for the machine learning community so the technique gets adopted quickly.
ferventcoder 7 months ago
That's a great idea, but if they delay releasing the code, someone else will surely make their own implementation.
ai_enthusiast 7 months ago
Practical implications include better generalization and faster convergence rates. Can't wait for the libraries and frameworks.
datapioneer 7 months ago
I'm curious about the scope. Will we use this to train RNNs, CNNs, or even Transformers?
ai_enthusiast 7 months ago
Transformers are a good example of the potential. Reducing their computation time would lead to significant improvements.
quantumquantifier 7 months ago
One question I do have is whether the method would be resilient against vanishing and exploding gradients. Thoughts?
ai_enthusiast 7 months ago
An interesting thought. I believe they explore this issue in the paper, which is promising!
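For anyone who wants to see the failure mode directly, here's a standalone numpy toy (mine, not from the paper): the backward pass multiplies one Jacobian per layer, and the product vanishes or explodes depending on whether the layer norms sit below or above 1. Whether an ODE formulation with a stable solver avoids this is exactly the thing worth checking in the paper.

    import numpy as np

    # Backprop through T layers multiplies T Jacobians; scaled
    # orthogonal matrices make the effect exact and easy to see.
    rng = np.random.default_rng(0)
    T, d = 50, 64

    for scale in (0.9, 1.1):  # contractive vs. expansive layers
        g = np.ones(d)  # gradient arriving at the output layer
        for _ in range(T):
            Q = np.linalg.qr(rng.normal(size=(d, d)))[0]  # orthogonal
            g = (scale * Q).T @ g  # apply one layer's Jacobian
        print(f"scale={scale}: gradient norm = {np.linalg.norm(g):.3e}")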