250 points by optimizer_queen 1 year ago flag hide 8 comments
optimizerguy 1 year ago next
Fascinating approach! I've been studying optimization problems for years and this method caught me off guard. I'll have to test it out in some of my projects. Any early benchmark results or success stories to share?
datascienceguru 1 year ago next
I tried it with a relatively small, real-world optimization problem, and preliminary results show that it escaped local minima and converged to the globally optimal solution almost every time. Fantastic!
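The thread doesn't name the technique, but the kind of "escapes a local minimum" test described here can be reproduced with a simple multistart baseline. This is my own illustrative sketch, not the method from the article: plain gradient descent from several random starting points on a double-well function whose global minimum sits in the left basin.

```python
# Illustrative multistart sketch (not the article's technique): a double-well
# f(x) = (x^2 - 1)^2 + 0.2x has a local minimum near x = +1 and the global
# minimum near x = -1. Restarting gradient descent from random points lets
# at least one run land in the global basin.
import random

def f(x):
    return (x * x - 1.0) ** 2 + 0.2 * x

def grad(x):
    return 4.0 * x * (x * x - 1.0) + 0.2

def descend(x, lr=0.01, steps=500):
    """Plain gradient descent from a single starting point."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def multistart(n_starts=20, seed=0):
    """Run descend() from random starts and keep the best endpoint."""
    rng = random.Random(seed)
    return min((descend(rng.uniform(-2.0, 2.0)) for _ in range(n_starts)), key=f)

x_best = multistart()  # lands near the global minimum at x ≈ -1
```

A single cold start from the right basin would get stuck near x = +1; the multistart wrapper is the cheapest baseline any escape-the-local-minimum claim should beat.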
optimizerguy 1 year ago next
That confirms my impression. Out of curiosity, did you test a 'warm-start' implementation? And do you expect the technique to hold up in higher dimensions?
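For readers unfamiliar with the term: a warm start just means seeding each solve with the solution of a closely related previous problem instead of a cold initial guess. A minimal sketch, assuming plain gradient descent and a toy family of shifted quadratics (all names here are illustrative):

```python
# Hypothetical warm-start sketch: minimize f(x) = (x - s)^2 for a sequence
# of shifts s, reusing the previous solution as the next starting point.

def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Plain gradient descent; returns the final iterate."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def solve_family(shifts):
    x = 0.0                       # cold start for the first problem only
    solutions = []
    for s in shifts:
        # gradient of (x - s)^2 is 2(x - s); bind s via a default argument
        x = gradient_descent(lambda x, s=s: 2.0 * (x - s), x)
        solutions.append(x)       # warm start: x seeds the next solve
    return solutions

sols = solve_family([0.0, 0.5, 1.0])
```

When consecutive problems are similar, the warm-started runs begin close to their optima, so each solve needs far fewer iterations than a cold start would.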
mathwhiz 1 year ago prev next
This has the potential to change the way we *all* solve optimization problems. The math behind it is brilliant. Anyone have ideas on how it could be incorporated into machine learning optimization?
deeplearningfan 1 year ago next
Absolutely! I think the technique could not only increase optimization speed, but also help us escape local minima more often. If it's presented well, I could see an ML conference session being approved!
speedysolver 1 year ago next
Theoretically, this technique should work very well in larger dimensions. I'll try running some tests to confirm as I explore use cases with more variables.
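One thing worth controlling for in those tests: on a well-conditioned separable objective, first-order methods barely care about dimension at all, so a fair scaling study should vary conditioning or coupling, not just variable count. A quick sanity-check sketch (my own, not from the article):

```python
# Hypothetical dimension-scaling check: gradient descent on the separable
# quadratic f(x) = sum(x_i^2). Each coordinate contracts by the same factor
# per step, so iteration counts are identical across dimensions.

def iterations_to_converge(dim, lr=0.1, tol=1e-6, max_iters=10_000):
    x = [1.0] * dim
    for i in range(1, max_iters + 1):
        x = [xi - lr * 2.0 * xi for xi in x]      # gradient of sum(x_i^2) is 2x
        if max(abs(xi) for xi in x) < tol:
            return i
    return max_iters

counts = {d: iterations_to_converge(d) for d in (2, 10, 100)}
```

If a new technique shows a real advantage, it should appear on ill-conditioned or non-separable problems, where the counts above would stop being flat.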
codecrusher 1 year ago prev next
I'm working on an open-source project that tackles very large-scale optimization problems. It involves distributed computing and other optimization techniques. I'll be sure to explore this new technique and see if I can implement it.
optialgonerd 1 year ago next
Have you guys compared this to the popular gradient descent optimizers? It might be good to include that somewhere in the study to give a better overall perspective on its performance.
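A baseline comparison doesn't need much scaffolding. Here's a minimal harness (illustrative only; the thread's technique would slot in as another `step_fn`) that counts iterations for plain gradient descent versus a momentum variant on f(x) = x²:

```python
# Hypothetical comparison harness: each optimizer is a step function
# (x, gradient, state) -> (new_x, new_state), and we count iterations
# until |x| drops below a tolerance. Settings are illustrative.

def run(step_fn, x0=5.0, tol=1e-6, max_iters=10_000):
    x, state = x0, None
    for i in range(1, max_iters + 1):
        x, state = step_fn(x, 2.0 * x, state)     # gradient of x^2 is 2x
        if abs(x) < tol:
            return i
    return max_iters

def gd_step(x, g, state, lr=0.05):
    return x - lr * g, None

def momentum_step(x, g, state, lr=0.05, beta=0.9):
    v = beta * (state or 0.0) + g                 # heavy-ball velocity
    return x - lr * v, v

iters_gd = run(gd_step)
iters_momentum = run(momentum_step)
```

Reporting iteration counts (and wall-clock time) against standard baselines like these would make any claimed speedup much easier to evaluate.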