optimizer-SUG-torch

Adaptive stochastic gradient method based on the universal gradient method. On each step, the universal method adjusts its estimate of the Lipschitz constant of the gradient so that the loss function is majorized by a quadratic model.

Please use https://nbviewer.jupyter.org/github/sverdoot if you have problems rendering the .ipynb files.
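The adaptive step described above can be sketched as follows. This is a minimal NumPy illustration of the general universal gradient idea (doubling the Lipschitz estimate `L` until the quadratic majorant upper-bounds the loss at the trial point), not the repository's SUG implementation; the function `universal_gradient_step` and all names in it are hypothetical.

```python
import numpy as np

def universal_gradient_step(f, grad, x, L, max_doublings=30):
    """One step of the universal gradient method (illustrative sketch):
    double L until f is majorized at the trial point by the quadratic
    model f(x) + <g, y - x> + (L/2)||y - x||^2, then accept the step."""
    g = grad(x)
    fx = f(x)
    y = x
    for _ in range(max_doublings):
        y = x - g / L                      # gradient step with step size 1/L
        d = y - x
        majorant = fx + g @ d + 0.5 * L * (d @ d)
        if f(y) <= majorant:               # quadratic model upper-bounds f: accept
            return y, L
        L *= 2.0                           # model too loose: double the estimate
    return y, L

# Toy usage: minimize f(x) = 0.5 * x^T A x, starting from a deliberately
# small initial estimate L; halving L between steps lets it adapt downward.
A = np.diag([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x, L = np.array([1.0, 1.0]), 1e-3
for _ in range(100):
    x, L = universal_gradient_step(f, grad, x, L)
    L /= 2.0
```

The majorant test guarantees monotone descent, `f(y) <= f(x) - ||g||^2 / (2L)`, so the loop converges even though `L` starts far too small; the stochastic variant applies the same test to mini-batch estimates of the loss and gradient.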
