Automatic Differentiation for Riemannian Optimization on Low-Rank Matrix and Tensor-Train Manifolds

- Article in a journal -

Area
Riemannian Optimization

Author(s)
Alexander Novikov, Maxim Rakhuba, Ivan Oseledets

Published in
SIAM Journal on Scientific Computing

Year
2022

Abstract
In scientific computing and machine learning applications, matrices and more general multidimensional arrays (tensors) can often be approximated with the help of low-rank decompositions. Since matrices and tensors of fixed rank form smooth Riemannian manifolds, one of the popular tools for finding low-rank approximations is to use Riemannian optimization. Nevertheless, efficient implementation of Riemannian gradients and Hessians, required in Riemannian optimization algorithms, can be a nontrivial task in practice. Moreover, in some cases, analytic formulas are not even available. In this paper, we build upon automatic differentiation and propose a method that, given an implementation of the function to be minimized, efficiently computes Riemannian gradients and matrix-by-vector products between an approximate Riemannian Hessian and a given vector.

AD Tools
T3F
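
Example
The method described in the abstract is implemented in the T3F library (t3f), which provides Riemannian automatic differentiation routines on top of TensorFlow. The sketch below is illustrative rather than taken from the paper: it assumes the t3f.gradients and t3f.hessian_vector_product entry points documented for recent t3f releases, and the tensor shape, TT-rank, and objective function are placeholders.

import t3f

# Target tensor and optimization variable, both in TT format with
# (illustrative) shape 3 x 4 x 5 and TT-rank 2.
w = t3f.random_tensor((3, 4, 5), tt_rank=2)
x = t3f.random_tensor((3, 4, 5), tt_rank=2)

def func(x):
    # Scalar objective: f(x) = 0.5 * <x, w>^2.
    return 0.5 * t3f.flat_inner(x, w) ** 2

# Riemannian gradient of func at x (the Euclidean gradient projected
# onto the tangent space of the fixed-rank manifold), obtained via
# automatic differentiation instead of hand-derived formulas.
riemannian_grad = t3f.gradients(func, x)

# Product of the approximate Riemannian Hessian of func at x with a
# direction given as another TT tensor.
vector = t3f.random_tensor((3, 4, 5), tt_rank=2)
hess_times_vec = t3f.hessian_vector_product(func, x, vector)

Both calls return TT objects in the tangent space at x, so they can be fed directly into a Riemannian optimization loop (retraction after each step).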

BibTeX
@ARTICLE{Novikov2022ADf,
  author   = "Novikov, Alexander and Rakhuba, Maxim and Oseledets, Ivan",
  title    = "Automatic Differentiation for {R}iemannian Optimization on Low-Rank Matrix and Tensor-Train Manifolds",
  journal  = "SIAM Journal on Scientific Computing",
  volume   = "44",
  number   = "2",
  pages    = "A843--A869",
  year     = "2022",
  doi      = "10.1137/20M1356774",
  url      = "https://doi.org/10.1137/20M1356774",
  abstract = "In scientific computing and machine learning applications, matrices and more general multidimensional arrays (tensors) can often be approximated with the help of low-rank decompositions. Since matrices and tensors of fixed rank form smooth Riemannian manifolds, one of the popular tools for finding low-rank approximations is to use Riemannian optimization. Nevertheless, efficient implementation of Riemannian gradients and Hessians, required in Riemannian optimization algorithms, can be a nontrivial task in practice. Moreover, in some cases, analytic formulas are not even available. In this paper, we build upon automatic differentiation and propose a method that, given an implementation of the function to be minimized, efficiently computes Riemannian gradients and matrix-by-vector products between an approximate Riemannian Hessian and a given vector.",
  ad_area  = "Riemannian Optimization",
  ad_tools = "T3F"
}

