Sitemap
A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.
Pages
Posts
How to jointly tune learning rate and weight decay for AdamW
TL;DR: AdamW is often considered a method that decouples weight decay and learning rate. In this blog post, we show that this is not true for the specific way AdamW is implemented in PyTorch. We also show how to adapt the tuning strategy in order to fix this: when doubling the learning rate, the weight decay should be halved.
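A minimal sketch of that rule (my own illustration, not code from the post): PyTorch's AdamW multiplies the weights by 1 - lr * weight_decay in every step, so the product lr * weight_decay is the effective decay to hold fixed while tuning.

    # Keep the effective decay lr * weight_decay constant while tuning lr.
    base_lr, base_wd = 1e-3, 1e-2
    effective_decay = base_lr * base_wd   # what actually shrinks the weights per step

    new_lr = 2 * base_lr                  # doubling the learning rate ...
    new_wd = effective_decay / new_lr     # ... means halving the weight decay
    assert abs(new_wd - base_wd / 2) < 1e-12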
Optimization Nuggets: Stochastic Polyak Step-size, Part 2
Fabian Pedregosa invited me to write a joint blog post on a convergence proof for the stochastic Polyak step size (SPS).
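For reference, a minimal sketch of one SPS update (my own illustration, not code from the post; f_i, grad_f_i, and f_i_star are hypothetical stand-ins for the sampled loss, its gradient, and a lower bound on its minimum, often 0 for interpolating models):

    import numpy as np

    def sps_step(x, f_i, grad_f_i, f_i_star=0.0, gamma_max=1.0):
        g = grad_f_i(x)
        # Polyak step size: suboptimality divided by squared gradient norm,
        # capped at gamma_max for stability (the SPS_max variant).
        step = min((f_i(x) - f_i_star) / (np.dot(g, g) + 1e-12), gamma_max)
        return x - step * g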
Decay No More
I wrote a blog post which was published in the ICLR Blogposts Track 2023. The post is titled Decay No More and explains the details of AdamW and its weight decay mechanism. Check it out here.
Solve it all and solve it fast: using numba for optimization in Python
When implementing optimization algorithms, we typically have to balance the following goals: the code should handle a wide range of problems (solve it all) and it should run fast (solve it fast).
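As a minimal illustration of the speed side (my own sketch, not code from the post), numba's @njit decorator compiles a plain-Python optimization subroutine, here the soft-thresholding operator used in l1-regularized problems, to fast machine code:

    import numpy as np
    from numba import njit

    @njit(cache=True)
    def soft_threshold(x, tau):
        # Proximal operator of tau * ||.||_1, applied elementwise.
        out = np.empty_like(x)
        for i in range(x.size):
            if x[i] > tau:
                out[i] = x[i] - tau
            elif x[i] < -tau:
                out[i] = x[i] + tau
            else:
                out[i] = 0.0
        return out

    x = np.random.randn(1000)
    soft_threshold(x, 0.1)  # first call compiles; subsequent calls run at C-like speed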
A collection of resources for creating open-source software packages
Making your research code open-source, tested, and documented is quite simple nowadays. This post gives an overview of the most important steps and collects useful resources, e.g. tutorials for Readthedocs, Sphinx (Gallery), and unit testing in Python.
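To give a flavor of the unit-testing step (an illustrative sketch; the function under test is hypothetical), a pytest test is just a plain function whose name starts with test_:

    import numpy as np

    def soft_threshold(x, tau):
        # Hypothetical function under test.
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def test_soft_threshold_shrinks_toward_zero():
        x = np.array([-2.0, -0.05, 0.0, 0.05, 2.0])
        expected = np.array([-1.9, 0.0, 0.0, 0.0, 1.9])
        np.testing.assert_allclose(soft_threshold(x, 0.1), expected)

Running pytest in the package root discovers and executes all such tests automatically.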
Portfolio
Publications
GGLasso - a Python package for General Graphical Lasso computation
Published in Journal of Open Source Software, 2021
Recommended citation: F. Schaipp, O. Vlasovets, C. L. Müller. (2021). "GGLasso - a Python package for General Graphical Lasso computation." Journal of Open Source Software, 6(68), 3865. https://joss.theoj.org/papers/10.21105/joss.03865
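For context, the basic problem instance that GGLasso generalizes is the single graphical lasso (a standard formulation, sketched here for illustration rather than taken from the paper): estimating a sparse precision matrix Theta from an empirical covariance matrix S,

    \begin{equation*}
        \min_{\Theta \succ 0} \; -\log\det\Theta \;+\; \mathrm{tr}(S\Theta)
        \;+\; \lambda \, \lVert \Theta \rVert_{1,\mathrm{od}},
    \end{equation*}

where the l1-norm is taken over the off-diagonal entries and lambda > 0 controls sparsity; the package's multiple-instance and latent-variable variants add further penalty terms.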