A benchmark of selected algorithmic differentiation tools on some problems in computer vision and machine learning
Article in a journal
Author(s) Filip Srajer, Zuzana Kukelova, Andrew Fitzgibbon
Published in Special issue of Optimization Methods & Software: Advances in Algorithmic Differentiation
Journal Optimization Methods & Software
Editor(s) Bruce Christianson, Shaun A. Forth, Andreas Griewank
Year 2018
Publisher Taylor & Francis
Abstract Algorithmic differentiation (AD) allows exact computation of derivatives given only an implementation of an objective function. Although many AD tools are available, a proper and efficient implementation of AD methods is not straightforward. The existing tools are often too different to allow for a general test suite. In this paper, we compare 15 ways of computing derivatives including 11 automatic differentiation tools implementing various methods and written in various languages (C++, F#, MATLAB, Julia and Python), 2 symbolic differentiation tools, finite differences and hand-derived computation. We look at three objective functions from computer vision and machine learning. These objectives are for the most part simple, in the sense that no iterative loops are involved, and conditional statements are encapsulated in functions such as abs or logsumexp. However, it is important for the success of AD that such ‘simple’ objective functions are handled efficiently, as so many problems in computer vision and machine learning are of this form.
Cross-References Christianson2018Sio
AD Tools Adept, ADiMat, ADOL-C, DiffSharp, TAPENADE, Theano
BibTeX
@ARTICLE{
Srajer2018Abo,
author = "Filip Srajer and Zuzana Kukelova and Andrew Fitzgibbon",
title = "A benchmark of selected algorithmic differentiation tools on some problems in computer
vision and machine learning",
journal = "Optimization Methods \& Software",
pages = "889--906",
year = "2018",
publisher = "Taylor \& Francis",
doi = "10.1080/10556788.2018.1435651",
url = "https://doi.org/10.1080/10556788.2018.1435651",
eprint = "https://doi.org/10.1080/10556788.2018.1435651",
crossref = "Christianson2018Sio",
volume = "33",
number = "4--6",
abstract = "Algorithmic differentiation (AD) allows exact computation of derivatives given only
an implementation of an objective function. Although many AD tools are available, a proper and
efficient implementation of AD methods is not straightforward. The existing tools are often too
different to allow for a general test suite. In this paper, we compare 15 ways of computing
derivatives including 11 automatic differentiation tools implementing various methods and written in
various languages (C++, F\#, MATLAB, Julia and Python), 2 symbolic differentiation tools,
finite differences and hand-derived computation. We look at three objective functions from computer
vision and machine learning. These objectives are for the most part simple, in the sense that no
iterative loops are involved, and conditional statements are encapsulated in functions such as abs
or logsumexp. However, it is important for the success of AD that such `simple'
objective functions are handled efficiently, as so many problems in computer vision and machine
learning are of this form.",
booktitle = "Special issue of Optimization Methods \& Software: Advances in
Algorithmic Differentiation",
editor = "Bruce Christianson and Shaun A. Forth and Andreas Griewank",
ad_tools = "Adept, ADiMat, ADOL-C, DiffSharp, TAPENADE, Theano"
}