An interior proximal gradient method for nonconvex optimization

We consider structured minimization problems subject to smooth inequality constraints and present a flexible algorithm that combines interior point (IP) and proximal gradient schemes. While traditional IP methods cannot cope with nonsmooth objective functions and proximal algorithms cannot handle complicated constraints, their combined use is shown to compensate for the respective shortcomings. We provide a theoretical characterization of the algorithm and its asymptotic properties, deriving convergence results for fully nonconvex problems and thus bridging the gap with previous works that addressed the convex case. Our interior proximal gradient algorithm benefits from warm starting, generates strictly feasible iterates with decreasing objective value, and returns after finitely many iterations a primal-dual pair approximately satisfying suitable optimality conditions. As a byproduct of our analysis of proximal gradient iterations, we demonstrate that a slight refinement of traditional backtracking techniques removes the need to upper bound the stepsize sequence, as required in existing results for the nonconvex setting.
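The record contains no further technical detail, but the abstract's description of the method (proximal gradient steps combined with an interior point barrier, with backtracking that keeps every iterate strictly feasible) can be illustrated with a generic sketch. The Python snippet below is a minimal illustration only, assuming a log-barrier reformulation and an l1-regularized least-squares problem over a Euclidean ball; the test data, parameter schedules, and the sufficient-decrease test are assumptions for the example, not the algorithm analyzed in the paper.

```python
# Minimal sketch (not the authors' algorithm): proximal gradient steps on a
# log-barrier reformulation of
#     minimize  f(x) + g(x)   subject to  c(x) <= 0,
# with backtracking that keeps every iterate strictly feasible.
# Problem data and parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
lam, radius = 0.1, 2.0          # l1 weight and ball radius (assumed)

f      = lambda x: 0.5 * np.sum((A @ x - b) ** 2)        # smooth part
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
c      = lambda x: np.dot(x, x) - radius ** 2            # constraint c(x) <= 0
grad_c = lambda x: 2.0 * x

def barrier_value_grad(x, mu):
    """Barrier-augmented smooth term phi_mu(x) = f(x) - mu*log(-c(x)) and its gradient."""
    val = f(x) - mu * np.log(-c(x))
    grad = grad_f(x) - mu / c(x) * grad_c(x)
    return val, grad

x = np.zeros(10)                # strictly feasible starting point
mu, gamma = 1.0, 1.0            # barrier parameter and stepsize (assumed schedule)
for outer in range(20):
    for inner in range(100):
        val, grad = barrier_value_grad(x, mu)
        # Backtracking: shrink the stepsize until the trial point is strictly
        # feasible and a standard sufficient-decrease inequality holds.
        while True:
            x_trial = prox_g(x - gamma * grad, gamma)
            d = x_trial - x
            if c(x_trial) < 0:
                trial_val, _ = barrier_value_grad(x_trial, mu)
                if trial_val <= val + grad @ d + np.dot(d, d) / (2 * gamma):
                    break
            gamma *= 0.5
        if np.linalg.norm(d) <= 1e-8:
            break
        x = x_trial
    mu *= 0.1                   # drive the barrier parameter toward zero

print("approximate solution:", np.round(x, 4), " norm:", np.linalg.norm(x))
```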

Bibliographic Details
Main Authors: De Marchi, Alberto (University of the Bundeswehr Munich, Department of Aerospace Engineering, Institute of Applied Mathematics and Scientific Computing, Werner-Heisenberg-Weg 39, 85577 Neubiberg, Germany); Themelis, Andreas (Kyushu University, Faculty of Information Science and Electrical Engineering (ISEE), 744 Motooka, Nishi-ku, 819-0395 Fukuoka, Japan)
Format: Article
Language: English
Published: Université de Montpellier, 2024-07-01
Series: Open Journal of Mathematical Optimization
ISSN: 2777-5860
DOI: 10.5802/ojmo.30
Subjects: Nonsmooth nonconvex optimization; interior point methods; proximal algorithms; locally Lipschitz gradient
Online Access: https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.30/
Collection: DOAJ
Institution: Kabale University