Trading off 1-norm and sparsity against rank for linear models using mathematical optimization: 1-norm minimizing partially reflexive ah-symmetric generalized inverses


Bibliographic Details
Main Authors: Fampa, Marcia, Lee, Jon, Ponte, Gabriel
Format: Article
Language:English
Published: Université de Montpellier 2021-06-01
Series:Open Journal of Mathematical Optimization
Subjects:
Online Access:https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.6/
_version_ 1825204916314963968
author Fampa, Marcia
Lee, Jon
Ponte, Gabriel
author_facet Fampa, Marcia
Lee, Jon
Ponte, Gabriel
author_sort Fampa, Marcia
collection DOAJ
description The M-P (Moore–Penrose) pseudoinverse has as a key application the computation of least-squares solutions of inconsistent systems of linear equations. Irrespective of whether a given input matrix is sparse, its M-P pseudoinverse can be dense, potentially leading to a high computational burden, especially when we are dealing with high-dimensional matrices. The M-P pseudoinverse is uniquely characterized by four properties, but only two of them need to be satisfied for the computation of least-squares solutions. Fampa and Lee (2018) and Xu, Fampa, Lee, and Ponte (2019) propose local-search procedures to construct sparse block-structured generalized inverses that satisfy the two key M-P properties, plus one more (the so-called reflexive property). That additional M-P property is equivalent to imposing a minimum-rank condition on the generalized inverse. (Vector) 1-norm minimization is used to induce sparsity and, importantly, to keep the magnitudes of the entries of the constructed generalized inverses under control. Here, we investigate the trade-off between low 1-norm and low rank for generalized inverses that can be used in the computation of least-squares solutions. We propose several algorithmic approaches that start from a 1-norm-minimizing generalized inverse satisfying the two key M-P properties, and gradually decrease its rank by iteratively imposing the reflexive property. The algorithms iterate until the generalized inverse has the least possible rank. During the iterations, we produce intermediate solutions, trading off low 1-norm (and typically high sparsity) against low rank.
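The abstract does not spell out which two of the four M-P properties suffice for least squares; in the generalized-inverse literature these are property 1 (AHA = A) and property 3 (AH symmetric — the "ah-symmetric" property in the title), while property 2 (HAH = H) is the reflexive one. A minimal NumPy sketch (illustrative only, not the authors' algorithm) checks the four properties for a candidate generalized inverse and verifies that the M-P pseudoinverse recovers the least-squares solution:

```python
import numpy as np

def mp_properties(A, H, tol=1e-8):
    """Report which of the four Moore-Penrose properties H satisfies
    as a generalized inverse of A."""
    return {
        "P1 (AHA = A)":            np.allclose(A @ H @ A, A, atol=tol),
        "P2 (HAH = H, reflexive)": np.allclose(H @ A @ H, H, atol=tol),
        "P3 (AH symmetric)":       np.allclose((A @ H).T, A @ H, atol=tol),
        "P4 (HA symmetric)":       np.allclose((H @ A).T, H @ A, atol=tol),
    }

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))   # full column rank with probability 1
H = np.linalg.pinv(A)             # M-P pseudoinverse: satisfies all four

# Any H with P1 and P3 yields least-squares solutions x = H b:
b = rng.standard_normal(5)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(H @ b, x_ls)

print(mp_properties(A, H))
```

Dropping P2 while keeping P1 and P3 is what opens the door to the sparser, higher-rank generalized inverses traded off in the paper.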
format Article
id doaj-art-00c936d60cc3420bb57e9fb0a089c9e4
institution Kabale University
issn 2777-5860
language English
publishDate 2021-06-01
publisher Université de Montpellier
record_format Article
series Open Journal of Mathematical Optimization
spelling doaj-art-00c936d60cc3420bb57e9fb0a089c9e4 2025-02-07T14:02:30Z eng Université de Montpellier Open Journal of Mathematical Optimization 2777-5860 2021-06-01 2114 10.5802/ojmo.6 10.5802/ojmo.6 Trading off 1-norm and sparsity against rank for linear models using mathematical optimization: 1-norm minimizing partially reflexive ah-symmetric generalized inverses Fampa, Marcia 0 Lee, Jon 1 Ponte, Gabriel 2 Federal University of Rio de Janeiro, Brazil University of Michigan, USA Federal University of Rio de Janeiro, Brazil The M-P (Moore–Penrose) pseudoinverse has as a key application the computation of least-squares solutions of inconsistent systems of linear equations. Irrespective of whether a given input matrix is sparse, its M-P pseudoinverse can be dense, potentially leading to high computational burden, especially when we are dealing with high-dimensional matrices. The M-P pseudoinverse is uniquely characterized by four properties, but only two of them need to be satisfied for the computation of least-squares solutions. Fampa and Lee (2018) and Xu, Fampa, Lee, and Ponte (2019) propose local-search procedures to construct sparse block-structured generalized inverses that satisfy the two key M-P properties, plus one more (the so-called reflexive property). That additional M-P property is equivalent to imposing a minimum-rank condition on the generalized inverse. (Vector) 1-norm minimization is used to induce sparsity and, importantly, to keep the magnitudes of entries under control for the generalized-inverses constructed. Here, we investigate the trade-off between low 1-norm and low rank for generalized inverses that can be used in the computation of least-squares solutions. We propose several algorithmic approaches that start from a $1$-norm minimizing generalized inverse that satisfies the two key M-P properties, and gradually decrease its rank, by iteratively imposing the reflexive property. The algorithms iterate until the generalized inverse has the least possible rank. During the iterations, we produce intermediate solutions, trading off low 1-norm (and typically high sparsity) against low rank. https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.6/ Moore–Penrose pseudoinverse generalized inverse linear model least squares sparse optimization low rank
spellingShingle Fampa, Marcia
Lee, Jon
Ponte, Gabriel
Trading off 1-norm and sparsity against rank for linear models using mathematical optimization: 1-norm minimizing partially reflexive ah-symmetric generalized inverses
Open Journal of Mathematical Optimization
Moore–Penrose pseudoinverse
generalized inverse
linear model
least squares
sparse optimization
low rank
title Trading off 1-norm and sparsity against rank for linear models using mathematical optimization: 1-norm minimizing partially reflexive ah-symmetric generalized inverses
title_full Trading off 1-norm and sparsity against rank for linear models using mathematical optimization: 1-norm minimizing partially reflexive ah-symmetric generalized inverses
title_fullStr Trading off 1-norm and sparsity against rank for linear models using mathematical optimization: 1-norm minimizing partially reflexive ah-symmetric generalized inverses
title_full_unstemmed Trading off 1-norm and sparsity against rank for linear models using mathematical optimization: 1-norm minimizing partially reflexive ah-symmetric generalized inverses
title_short Trading off 1-norm and sparsity against rank for linear models using mathematical optimization: 1-norm minimizing partially reflexive ah-symmetric generalized inverses
title_sort trading off 1 norm and sparsity against rank for linear models using mathematical optimization 1 norm minimizing partially reflexive ah symmetric generalized inverses
topic Moore–Penrose pseudoinverse
generalized inverse
linear model
least squares
sparse optimization
low rank
url https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.6/
work_keys_str_mv AT fampamarcia tradingoff1normandsparsityagainstrankforlinearmodelsusingmathematicaloptimization1normminimizingpartiallyreflexiveahsymmetricgeneralizedinverses
AT leejon tradingoff1normandsparsityagainstrankforlinearmodelsusingmathematicaloptimization1normminimizingpartiallyreflexiveahsymmetricgeneralizedinverses
AT pontegabriel tradingoff1normandsparsityagainstrankforlinearmodelsusingmathematicaloptimization1normminimizingpartiallyreflexiveahsymmetricgeneralizedinverses