Screening for a Reweighted Penalized Conditional Gradient Method

The conditional gradient method (CGM) is widely used in large-scale sparse convex optimization, having a low per-iteration computational cost for structured sparse regularizers and a greedy approach for collecting nonzeros. We explore the sparsity-acquiring properties of a general penalized CGM (P-CGM) for convex regularizers and a reweighted penalized CGM (RP-CGM) for nonconvex regularizers, replacing the usual convex constraints with gauge-inspired penalties. This generalization does not increase the per-iteration complexity noticeably. Without assuming bounded iterates or using line search, we show $O(1/t)$ convergence of the gap of each subproblem, which measures distance to a stationary point. We couple this with a screening rule which is safe in the convex case, converging to the true support at a rate $O(1/\delta^2)$, where $\delta \ge 0$ measures how close the problem is to degeneracy. In the nonconvex case the screening rule converges to the true support in a finite number of iterations, but is not necessarily safe in the intermediate iterates. In our experiments, we verify the consistency of the method and adjust the aggressiveness of the screening rule by tuning the concavity of the regularizer.

Bibliographic Details
Main Authors: Sun, Yifan, Bach, Francis
Format: Article
Language: English
Published: Université de Montpellier 2022-06-01
Series: Open Journal of Mathematical Optimization
Subjects: Dual screening; conditional gradient method; atomic sparsity; reweighted optimization
Online Access: https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.14/
collection DOAJ
description The conditional gradient method (CGM) is widely used in large-scale sparse convex optimization, having a low per-iteration computational cost for structured sparse regularizers and a greedy approach for collecting nonzeros. We explore the sparsity-acquiring properties of a general penalized CGM (P-CGM) for convex regularizers and a reweighted penalized CGM (RP-CGM) for nonconvex regularizers, replacing the usual convex constraints with gauge-inspired penalties. This generalization does not increase the per-iteration complexity noticeably. Without assuming bounded iterates or using line search, we show $O(1/t)$ convergence of the gap of each subproblem, which measures distance to a stationary point. We couple this with a screening rule which is safe in the convex case, converging to the true support at a rate $O(1/\delta^2)$, where $\delta \ge 0$ measures how close the problem is to degeneracy. In the nonconvex case the screening rule converges to the true support in a finite number of iterations, but is not necessarily safe in the intermediate iterates. In our experiments, we verify the consistency of the method and adjust the aggressiveness of the screening rule by tuning the concavity of the regularizer.
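As background for the abstract, the following is a minimal sketch of the classical constrained conditional gradient (Frank-Wolfe) iteration that the paper's penalized variants (P-CGM, RP-CGM) generalize, applied to an l1-constrained least-squares problem. This is illustrative only: the function name and problem data are invented here, and it does not implement the paper's penalized or reweighted methods.

```python
# Sketch of the classical conditional gradient (Frank-Wolfe) method for
# min_x 0.5*||A x - b||^2 subject to ||x||_1 <= tau. Hypothetical example;
# not the P-CGM/RP-CGM of the paper, which replaces the constraint with a
# gauge-inspired penalty.
import numpy as np

def frank_wolfe_l1(A, b, tau, iters=200):
    """Minimize 0.5*||A x - b||^2 over the l1 ball of radius tau."""
    m, n = A.shape
    x = np.zeros(n)
    for t in range(iters):
        grad = A.T @ (A @ x - b)
        # Linear minimization oracle over the l1 ball: a signed vertex.
        # This is the greedy "collect one nonzero" step of CGM.
        i = int(np.argmax(np.abs(grad)))
        s = np.zeros(n)
        s[i] = -tau * np.sign(grad[i])
        gamma = 2.0 / (t + 2)  # open-loop step size; no line search
        x = (1 - gamma) * x + gamma * s
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 1.5]
b = A @ x_true
x_hat = frank_wolfe_l1(A, b, tau=np.abs(x_true).sum())
```

Each iteration moves toward a single signed coordinate vertex, so the iterate's support grows by at most one coordinate per step; this is the "greedy approach for collecting nonzeros" the abstract refers to, and the $O(1/t)$ subproblem-gap rate stated there is the same order as the classical Frank-Wolfe convergence rate.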
id doaj-art-98930d8acc154539859f189f9e86f91c
institution Kabale University
issn 2777-5860
doi 10.5802/ojmo.14
affiliation (Sun, Yifan): Stonybrook University - Department of Computer Science, Stonybrook, New York, USA
affiliation (Bach, Francis): INRIA - Département d’Informatique de l’Ecole Normale Supérieure, PSL Research University, Paris, France
topic Dual screening
conditional gradient method
atomic sparsity
reweighted optimization