Software
MoMo: adaptive learning rates for momentum methods. Available as a PyTorch package and also implemented in Optax.
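A minimal sketch of how a MoMo-style optimizer plugs into a standard PyTorch training loop. The `momo` import path, the `Momo` class name, and the `step(loss=...)` keyword are assumptions here; check the package README for the exact API.

```python
import torch

# Assumed import path and class name for the MoMo PyTorch package.
from momo import Momo

model = torch.nn.Linear(10, 1)
# lr acts as an upper bound on the adaptive step size (assumption based on the method's design).
optimizer = Momo(model.parameters(), lr=1.0)

x, y = torch.randn(64, 10), torch.randn(64, 1)
for _ in range(100):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    # MoMo-type methods use the current loss value to set the adaptive step size
    # (assumed keyword argument; the package may instead expect a closure).
    optimizer.step(loss=loss)
```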
step-back: a package for running optimization experiments in PyTorch. Contains training results of MoMo, SGD-M, and Adam on many standard benchmarks.
ProxSPS: a PyTorch optimizer implementing Polyak step sizes with weight decay.
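For context, here is a bare-bones sketch of the classical Polyak step size that ProxSPS builds on (the package additionally handles weight decay via a proximal step). This is plain gradient descent on a toy least-squares problem where the optimal value is assumed to be zero; it is an illustration of the step-size rule, not the package's implementation.

```python
import torch

# Toy least-squares problem: f(w) = 0.5 * ||A w - b||^2, with f* = 0 (interpolation setting).
torch.manual_seed(0)
A = torch.randn(100, 20)
w_true = torch.randn(20)
b = A @ w_true

w = torch.zeros(20, requires_grad=True)
gamma_max = 1.0  # cap on the step size

for _ in range(200):
    loss = 0.5 * torch.sum((A @ w - b) ** 2)
    grad, = torch.autograd.grad(loss, w)
    # Polyak step size: (f(w) - f*) / ||grad||^2, here with f* = 0, clipped at gamma_max.
    gamma = min(gamma_max, loss.item() / (grad.norm() ** 2 + 1e-12).item())
    with torch.no_grad():
        w -= gamma * grad
```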
GGLasso: a Python package for solving general Graphical Lasso problems.
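A hedged usage sketch for estimating a sparse precision matrix from an empirical covariance. The `glasso_problem` entry point, its arguments, and the `solution.precision_` attribute are written from memory and should be treated as assumptions; the package documentation is the authoritative reference.

```python
import numpy as np
from gglasso.problem import glasso_problem  # assumed import path

# Empirical covariance matrix of N samples with p variables.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
S = np.cov(X, rowvar=False)

# Single Graphical Lasso: min_Theta  -logdet(Theta) + tr(S Theta) + lambda1 * ||Theta||_{1,off}
P = glasso_problem(S, N=200, reg_params={"lambda1": 0.05})  # assumed signature
P.solve()
Theta = P.solution.precision_  # assumed attribute holding the estimated precision matrix
```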
ncOPT: a prototype Python implementation of SQP-GS for nonsmooth, nonconvex constrained optimization.
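To make the problem class concrete, here is a small illustration of the kind of instance SQP-GS is designed for: a nonsmooth, nonconvex objective with a nonsmooth inequality constraint. This only sketches the problem setup in plain NumPy and does not use the ncOPT API.

```python
import numpy as np

# Problem class:  min_x f(x)  s.t.  g(x) <= 0,  with f and g nonsmooth and nonconvex.
def f(x):
    # Nonsmooth Rosenbrock-type objective.
    return 10.0 * np.abs(x[0] ** 2 - x[1]) + (1.0 - x[0]) ** 2

def g(x):
    # Max-type inequality constraint: feasible iff max(|x1|, |x2|) <= 2.
    return np.maximum(np.abs(x[0]), np.abs(x[1])) - 2.0
```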
If you use any of my software and run into issues or have feedback, please open an issue on GitHub.