A review of automatic differentiation and its efficient implementation

- Article in a journal -

Author(s)
Charles C. Margossian

Published in
WIREs Data Mining and Knowledge Discovery

Year
2019

Abstract
Derivatives play a critical role in computational statistics, examples being Bayesian inference using Hamiltonian Monte Carlo sampling and the training of neural networks. Automatic differentiation (AD) is a powerful tool to automate the calculation of derivatives and is preferable to more traditional methods, especially when differentiating complex algorithms and mathematical functions. The implementation of AD, however, requires some care to ensure efficiency. Modern differentiation packages deploy a broad range of computational techniques to improve applicability, run time, and memory management. Among these techniques are operator overloading, region-based memory, and expression templates. There also exist several mathematical techniques which can yield high performance gains when applied to complex algorithms. For example, semi-analytical derivatives can reduce by orders of magnitude the runtime required to numerically solve and differentiate an algebraic equation. Open and practical problems include the extension of current packages to provide more specialized routines, and finding optimal methods to perform higher-order differentiation.

AD Theory and Techniques
Introduction
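
The operator overloading named in the abstract can be illustrated with a minimal forward-mode sketch. This is not the implementation of any package reviewed in the article; the type Dual, its overloads, and the function f below are illustrative names only. The idea is that every arithmetic operation propagates a derivative alongside its value:

// Minimal forward-mode AD via operator overloading (illustrative sketch only).
#include <cmath>
#include <iostream>

struct Dual {
  double val;  // function value
  double der;  // derivative with respect to the seeded input
};

Dual operator+(Dual a, Dual b) { return {a.val + b.val, a.der + b.der}; }

// Product rule: (uv)' = u'v + uv'.
Dual operator*(Dual a, Dual b) {
  return {a.val * b.val, a.der * b.val + a.val * b.der};
}

// Chain rule for an elementary function.
Dual sin(Dual a) { return {std::sin(a.val), std::cos(a.val) * a.der}; }

// f(x) = x*sin(x) + x is written once; the overloads differentiate it automatically.
Dual f(Dual x) { return x * sin(x) + x; }

int main() {
  Dual x{2.0, 1.0};  // seed dx/dx = 1
  Dual y = f(x);
  std::cout << "f(2) = " << y.val << ", f'(2) = " << y.der << "\n";
  return 0;
}

Compiled with any C++11 compiler, this prints f(2) ≈ 3.82 and f'(2) ≈ 1.08, matching the analytic derivative sin(x) + x cos(x) + 1. Reverse mode, the variant needed for gradients in Hamiltonian Monte Carlo and neural-network training, instead records the overloaded operations on a tape (held in the region-based memory the abstract mentions) and sweeps through them backwards. The semi-analytical derivatives mentioned for the algebraic equation typically rest on the implicit function theorem: if g(θ, y) = 0 defines the solution y(θ), then dy/dθ = −(∂g/∂y)⁻¹ ∂g/∂θ, so only the solution itself, not its sensitivity, has to be obtained iteratively.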

BibTeX
@ARTICLE{Margossian2019Aro,
       author = "Margossian, Charles C.",
       title = "A review of automatic differentiation and its efficient implementation",
       journal = "WIREs Data Mining and Knowledge Discovery",
       volume = "9",
       number = "4",
       pages = "e1305",
       keywords = "automatic differentiation, computational statistics, numerical methods",
       doi = "10.1002/widm.1305",
       url = "https://doi.org/10.1002/widm.1305",
       eprint = "https://www.onlinelibrary.wiley.com/doi/pdf/10.1002/widm.1305",
       abstract = "Derivatives play a critical role in computational statistics, examples being
         Bayesian inference using Hamiltonian Monte Carlo sampling and the training of neural networks.
         Automatic differentiation (AD) is a powerful tool to automate the calculation of derivatives and is
         preferable to more traditional methods, especially when differentiating complex algorithms and
         mathematical functions. The implementation of AD, however, requires some care to insure efficiency.
         Modern differentiation packages deploy a broad range of computational techniques to improve
         applicability, run time, and memory management. Among these techniques are operation overloading,
         region-based memory, and expression templates. There also exist several mathematical techniques
         which can yield high performance gains when applied to complex algorithms. For example,
         semi-analytical derivatives can reduce by orders of magnitude the runtime required to numerically
         solve and differentiate an algebraic equation. Open and practical problems include the extension of
         current packages to provide more specialized routines, and finding optimal methods to perform
         higher-order differentiation.",
       year = "2019",
       ad_theotech = "Introduction"
}

