Proximal gradient method with automatic selection of the parameter by automatic differentiation
Article in a journal
Author(s) Yingyi Li, Haibin Zhang, Zhibao Li, Huan Gao
Published in Special issue of Optimization Methods & Software: Advances in Algorithmic Differentiation
Journal Optimization Methods & Software
Editor(s) Bruce Christianson, Shaun A. Forth, Andreas Griewank
Year 2018
Publisher Taylor & Francis
Abstract A class of non-smooth convex optimization problems, which arises naturally in applications such as the sparse group Lasso, has attracted significant research effort on parameter selection. For given parameters, the proximal gradient method (PGM) solves such problems effectively, with a linear convergence rate and a closed-form solution at each iteration. In many practical applications, however, the choice of parameters affects not only the quality of the solution but even whether the solution is correct at all. In this paper, we study a new method to analyse the impact of the parameters on the PGM algorithm for solving the non-smooth convex optimization problem. We present a sensitivity analysis of the output of the optimization algorithm with respect to the parameter, and show the advantage of carrying out this analysis with automatic differentiation. We then propose a hybrid algorithm, based on PGM, for selecting the optimal parameter. Numerical results show that the proposed method is effective for solving the sparse signal recovery problem.
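The idea in the abstract can be sketched in a few lines: run an ISTA-style proximal gradient iteration for the Lasso problem min 0.5||Ax-b||^2 + λ||x||_1, and propagate the derivative of each iterate with respect to λ in forward mode alongside the iteration. This is only a minimal illustration of iterate-sensitivity, not the paper's algorithm: the derivative of the soft-thresholding prox is hand-coded here, standing in for what automatic differentiation would produce, and the function and variable names are our own.

```python
import numpy as np

def soft_threshold(z, tau):
    """Prox of tau * ||.||_1: elementwise shrinkage toward zero."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def pgm_with_sensitivity(A, b, lam, steps=1000):
    """Proximal gradient (ISTA) for min 0.5||Ax-b||^2 + lam*||x||_1,
    propagating dx/dlam by forward-mode differentiation of each iterate."""
    t = 1.0 / np.linalg.norm(A, 2) ** 2      # step size 1/L, L = ||A||_2^2
    x = np.zeros(A.shape[1])
    dx = np.zeros_like(x)                    # running derivative dx/dlam
    for _ in range(steps):
        z = x - t * A.T @ (A @ x - b)        # gradient step
        dz = dx - t * (A.T @ (A @ dx))       # chain rule through gradient step
        mask = (np.abs(z) > t * lam).astype(float)   # active (non-zeroed) entries
        x = soft_threshold(z, t * lam)
        # d/dlam of soft_threshold(z, t*lam): 1 w.r.t. z, -t*sign(z) w.r.t. lam,
        # both only where the entry survives thresholding
        dx = mask * dz - np.sign(z) * t * mask
    return x, dx
```

A quick sanity check of the sensitivity is to compare `dx` against a central finite difference of the converged solution in λ; away from breakpoints of the active set the two should agree closely.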
Cross-References Christianson2018Sio |
BibTeX
@ARTICLE{
Li2018Pgm,
crossref = "Christianson2018Sio",
author = "Yingyi Li and Haibin Zhang and Zhibao Li and Huan Gao",
title = "Proximal gradient method with automatic selection of the parameter by automatic
differentiation",
journal = "Optimization Methods \& Software",
volume = "33",
number = "4--6",
pages = "708--717",
year = "2018",
publisher = "Taylor \& Francis",
doi = "10.1080/10556788.2018.1435648",
url = "https://doi.org/10.1080/10556788.2018.1435648",
eprint = "https://doi.org/10.1080/10556788.2018.1435648",
abstract = "A class of non-smooth convex optimization problems, which arises naturally in
applications such as the sparse group Lasso, has attracted significant research effort on parameter
selection. For given parameters, the proximal gradient method (PGM) solves such problems
effectively, with a linear convergence rate and a closed-form solution at each iteration. In many
practical applications, however, the choice of parameters affects not only the quality of the
solution but even whether the solution is correct at all. In this paper, we study a new method to
analyse the impact of the parameters on the PGM algorithm for solving the non-smooth convex
optimization problem. We present a sensitivity analysis of the output of the optimization algorithm
with respect to the parameter, and show the advantage of carrying out this analysis with automatic
differentiation. We then propose a hybrid algorithm, based on PGM, for selecting the optimal
parameter. Numerical results show that the proposed method is effective for solving the sparse
signal recovery problem.",
booktitle = "Special issue of Optimization Methods \& Software: Advances in
Algorithmic Differentiation",
editor = "Bruce Christianson and Shaun A. Forth and Andreas Griewank"
}