Tight analyses for subgradient descent I: Lower bounds
| Field | Value |
|---|---|
| Main Authors | |
| Format | Article |
| Language | English |
| Published | Université de Montpellier, 2024-07-01 |
| Series | Open Journal of Mathematical Optimization |
| Subjects | |
| Online Access | https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.31/ |
| Summary | Consider the problem of minimizing functions that are Lipschitz and convex, but not necessarily differentiable. We construct a function from this class for which the $T^{\text{th}}$ iterate of subgradient descent has error $\Omega(\log(T)/\sqrt{T})$. This matches a known upper bound of $O(\log(T)/\sqrt{T})$. We prove analogous results for functions that are additionally strongly convex: there exists such a function for which the $T^{\text{th}}$ iterate of subgradient descent has error $\Omega(\log(T)/T)$, matching a known upper bound of $O(\log(T)/T)$. These results resolve a question posed by Shamir (2012). |
| ISSN | 2777-5860 |
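
The summary concerns the final-iterate error of subgradient descent on Lipschitz convex functions. As a point of reference, here is a minimal sketch of the method itself, run on the simple 1-Lipschitz function $f(x) = |x|$; the target function, starting point, and step-size constant are assumptions chosen for illustration, not the paper's lower-bound construction.

```python
# Minimal sketch of subgradient descent with the standard step size
# eta_t = c / sqrt(t + 1), applied to f(x) = |x| (convex, 1-Lipschitz,
# nonsmooth at 0). Illustrative only: the target function, starting
# point, and constant c are assumptions, not taken from the article.
import math

def subgradient(x):
    # A valid subgradient of f(x) = |x|: sign(x), choosing 0 at x = 0.
    return (x > 0) - (x < 0)

def subgradient_descent(x0, T, c=1.0):
    """Return the T-th iterate x_T of subgradient descent from x0."""
    x = x0
    for t in range(T):
        x -= (c / math.sqrt(t + 1)) * subgradient(x)
    return x

for T in (10, 100, 1000, 10000):
    x_T = subgradient_descent(x0=2.0, T=T)
    # Here f* = 0, so |x_T| is the final-iterate error. On this easy
    # example the last iterate settles within roughly one step size of
    # the minimizer; the article constructs harder functions on which
    # the final-iterate error is as large as Omega(log(T)/sqrt(T)).
    print(f"T={T:6d}  error={abs(x_T):.5f}  log(T)/sqrt(T)={math.log(T)/math.sqrt(T):.5f}")
```

For the strongly convex case in the second half of the summary, the usual step-size choice is $\eta_t \propto 1/t$ rather than $1/\sqrt{t}$, and the paper gives a matching final-iterate lower bound of $\Omega(\log(T)/T)$.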