
More AD of Nonlinear AMPL Models: Computing Hessian Information and Exploiting Partial Separability

- incollection -
 

Author(s)
David M. Gay

Published in
Computational Differentiation: Techniques, Applications, and Tools

Editor(s)
Martin Berz, Christian Bischof, George Corliss, Andreas Griewank

Year
1996

Publisher
SIAM

Abstract
We describe computational experience with automatic differentiation of mathematical programming problems expressed in the modeling language AMPL. Nonlinear expressions are translated to loop-free code, which makes it easy to compute gradients and Jacobians by backward automatic differentiation. The nonlinear expressions may be interpreted or, to gain some evaluation speed at the cost of increased preparation time, converted to Fortran or C. We have extended the interpretive scheme to evaluate Hessian (of Lagrangian) times vector. Detecting partially separable structure (sums of terms, each depending, perhaps after a linear transformation, on only a few variables) is of independent interest, as some solvers exploit this structure. It can be detected automatically by suitable "tree walks". Exploiting this structure permits an AD computation of the entire Hessian matrix by accumulating Hessian times vector computations for each term, and can lead to a much faster computation of the Hessian than by computing the whole Hessian times each unit vector.
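The accumulation described in the last sentence can be sketched as follows. This is a minimal, hypothetical illustration in JAX, not taken from the paper (the AMPL solver interface itself is C/Fortran): for a partially separable objective f(x) = sum_i f_i(U_i x), the Hessian is sum_i U_i^T (nabla^2 f_i)(U_i x) U_i, so one can accumulate small per-term Hessian blocks instead of multiplying the whole Hessian by every unit vector. The example problem phi and the 2-variable terms are invented for the sketch.

import jax
import jax.numpy as jnp

# Hypothetical partially separable objective: f(x) = sum_i phi(x[i] - x[i+1]),
# where each term depends on only two adjacent variables.
def phi(t):
    return jnp.cosh(t)

n = 6
x = jnp.linspace(0.0, 1.0, n)

def f(x):
    return jnp.sum(phi(x[:-1] - x[1:]))

# Unstructured approach: n Hessian-times-vector products, one per unit vector,
# each via forward-over-reverse AD on the full model.
def hvp(f, x, v):
    return jax.jvp(jax.grad(f), (x,), (v,))[1]

H_dense = jnp.stack([hvp(f, x, jnp.eye(n)[j]) for j in range(n)], axis=1)

# Structured approach: each term sees only 2 variables, so its Hessian is a
# 2x2 block; accumulate the small blocks into the full matrix.
def term(y):                      # y = (x[i], x[i+1])
    return phi(y[0] - y[1])

H_struct = jnp.zeros((n, n))
for i in range(n - 1):
    Hi = jax.hessian(term)(x[i:i + 2])           # 2x2 per-term Hessian
    H_struct = H_struct.at[i:i + 2, i:i + 2].add(Hi)

assert jnp.allclose(H_dense, H_struct)

The point of the structured variant is that the per-term Hessians are tiny (here 2x2), so the work grows with the number of terms and their arity rather than with n full-model Hessian-times-vector evaluations.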

Cross-References
Berz1996CDT

AD Theory and Techniques
Hessian

BibTeX
@INCOLLECTION{
         Gay1996MAo,
       author = "David M. Gay",
       editor = "Martin Berz and Christian Bischof and George Corliss and Andreas Griewank",
       title = "More {AD} of Nonlinear {AMPL} Models: Computing {Hessian} Information and Exploiting
         Partial Separability",
       booktitle = "Computational Differentiation: Techniques, Applications, and Tools",
       pages = "173--184",
       publisher = "SIAM",
       address = "Philadelphia, PA",
       key = "Gay1996MAo",
       crossref = "Berz1996CDT",
       abstract = "We describe computational experience with automatic differentiation of mathematical
         programming problems expressed in the modeling language AMPL. Nonlinear expressions are translated
         to loop-free code, which makes it easy to compute gradients and Jacobians by backward automatic
         differentiation. The nonlinear expressions may be interpreted or, to gain some evaluation speed at
         the cost of increased preparation time, converted to Fortran or C. We have extended the interpretive
         scheme to evaluate Hessian (of Lagrangian) times vector. Detecting partially separable structure
         (sums of terms, each depending, perhaps after a linear transformation, on only a few variables) is
         of independent interest, as some solvers exploit this structure. It can be detected automatically by
         suitable ``tree walks''. Exploiting this structure permits an AD computation of the entire
         Hessian matrix by accumulating Hessian times vector computations for each term, and can lead to a
         much faster computation of the Hessian than by computing the whole Hessian times each unit vector.",
       keywords = "AMPL, Hessian, partial separability, tree walks.",
       year = "1996",
       ad_theotech = "Hessian"
}

