On the Impact of Automatic Differentiation on the Relative Performance of Parallel Truncated Newton and Variable Metric Algorithms

- Article in a journal -
 

Area
Optimization

Author(s)
Laurence C. W. Dixon

Published in
SIAM J. Optim.

Year
1991

Abstract
The sparse doublet method for obtaining the gradient of a function or the Jacobian of a vector will be described and contrasted with reverse automatic differentiation. Its extension, the sparse triplet method for finding the Hessian of a function, will also be described, and the effect of using these within classic optimisation algorithms discussed. Results obtained using a parallel implementation of sparse triplet automatic differentiation of a partially separable function on the Sequent Balance will be presented. In this paper it is shown that:
- automatic differentiation can no longer be neglected as a method for calculating derivatives;
- sparse triplets provide an effective method that can be implemented in parallel for calculating the Hessian matrix;
- this approach can be combined effectively with the truncated Newton method when solving large unconstrained optimisation problems on parallel processors.
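The doublet/triplet idea in the abstract can be illustrated with a short sketch (not from the paper): a triplet carries (value, gradient, Hessian) through each elementary operation, so second derivatives are accumulated alongside the function evaluation, while a doublet is the same construction restricted to (value, gradient). Dixon stores the gradient and Hessian sparsely and evaluates the element functions of a partially separable objective in parallel; this dense, serial Python illustration omits both refinements, and all names in it (Triplet, variable, and so on) are invented for the example.

import numpy as np

class Triplet:
    # Carries (value, gradient, Hessian) through arithmetic; a "doublet"
    # would drop the Hessian and propagate (value, gradient) only.
    def __init__(self, v, g, h):
        self.v = v                              # scalar value
        self.g = np.asarray(g, dtype=float)     # gradient, shape (n,)
        self.h = np.asarray(h, dtype=float)     # Hessian, shape (n, n)

    @classmethod
    def variable(cls, value, index, n):
        # Seed the index-th independent variable: unit gradient, zero Hessian.
        g = np.zeros(n)
        g[index] = 1.0
        return cls(value, g, np.zeros((n, n)))

    def __add__(self, other):
        # Sum rule: derivatives add componentwise.
        return Triplet(self.v + other.v, self.g + other.g, self.h + other.h)

    def __mul__(self, other):
        # Product rule for the Hessian:
        # H(uv) = u H(v) + v H(u) + grad(u) grad(v)^T + grad(v) grad(u)^T
        return Triplet(
            self.v * other.v,
            self.v * other.g + other.v * self.g,
            self.v * other.h + other.v * self.h
            + np.outer(self.g, other.g) + np.outer(other.g, self.g),
        )

# f(x, y) = x*y + x*x evaluated at (3, 2)
n = 2
x = Triplet.variable(3.0, 0, n)
y = Triplet.variable(2.0, 1, n)
f = x * y + x * x
print(f.v)   # 15.0
print(f.g)   # [8. 3.]             (= [y + 2x, x])
print(f.h)   # [[2. 1.]
             #  [1. 0.]]

Extending + and * (and the remaining elementary operations) in this way yields the exact Hessian of any program built from them, which is the kind of exact second-derivative information the abstract describes feeding to the truncated Newton method.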

AD Theory and Techniques
Hessian, Parallelism

BibTeX
@ARTICLE{Dixon1991OtI,
       author = "Laurence C. W. Dixon",
       title = "On the Impact of Automatic Differentiation on the Relative Performance of Parallel
         Truncated {N}ewton and Variable Metric Algorithms",
       journal = "SIAM J. Optim.",
       pages = "475--486",
       key = "Dixon1991OtI",
       referred = "[More2001ADT].",
       year = "1991",
       volume = "1",
       abstract = "The sparse doublet method for obtaining the gradient of a function or the Jacobian
         of a vector will be described and contrasted with reverse automatic differentiation. Its extension,
         the sparse triplet method for finding the Hessian of a function, will also be described and the
         effect of using these within classic optimisation algorithms discussed. Results obtained using a
         parallel implementation of sparse triplet automatic differentiation of a partially separable
         function on the Sequent Balance will be presented. In this paper it is shown that: (bullet)
         automatic differentiation can no longer be neglected as a method for calculating derivatives;
         (bullet) sparse triplets provide an effective method that can be implemented in parallel for
         calculating the Hessian matrix; (bullet) this approach can be combined effectively with the
         truncated Newton method when solving large unconstrained optimisation problems on parallel
         processors.",
       keywords = "automatic differentiation; parallel computation; optimisation",
       url = "http://link.aip.org/link/?SJE/1/475/1",
       doi = "10.1137/0801028",
       ad_area = "Optimization",
       ad_theotech = "Hessian, Parallelism"
}

