Publication: A Simple and Efficient Tensor Calculus


- Part of a collection -

Author(s)
Sören Laue, Matthias Mitterreiter, Joachim Giesen

Published in
The Thirty-Fourth AAAI Conference on Artificial Intelligence, AAAI 2020, New York, NY, USA, February 7--12, 2020

Year
2020

Publisher
AAAI Press

Abstract
Computing derivatives of tensor expressions, also known as tensor calculus, is a fundamental task in machine learning. A key concern is the efficiency of evaluating the expressions and their derivatives, which hinges on the representation of these expressions. Recently, an algorithm for computing higher-order derivatives of tensor expressions such as Jacobians or Hessians was introduced that is a few orders of magnitude faster than previous state-of-the-art approaches. Unfortunately, that approach is based on Ricci notation and hence cannot be incorporated into automatic differentiation frameworks like TensorFlow, PyTorch, autograd, or JAX, which use the simpler Einstein notation. This leaves two options: either change the underlying tensor representation in these frameworks, or develop a new, provably correct algorithm based on Einstein notation. Since the first option is impractical, we pursue the second. Here, we show that Ricci notation is not necessary for an efficient tensor calculus, and we develop an equally efficient method for the simpler Einstein notation. It turns out that switching to Einstein notation enables further improvements that lead to even better efficiency.
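The following sketch is not taken from the paper; it merely illustrates the setting the abstract describes: a tensor expression written in Einstein notation (here via NumPy's `einsum`) whose gradient and Hessian have closed forms that can be checked numerically. The example function, matrix `A`, and vector `x` are all illustrative choices, not the paper's benchmarks.

```python
import numpy as np

# Quadratic form f(x) = x^T A x, written in Einstein notation as the
# einsum "i,ij,j->". Its gradient is (A + A^T) x and its Hessian is the
# constant matrix A + A^T.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
x = rng.standard_normal(4)

f = lambda v: np.einsum("i,ij,j->", v, A, v)

grad = (A + A.T) @ x   # analytic gradient
hess = A + A.T         # analytic Hessian (symmetric by construction)

# Sanity-check the analytic gradient with central finite differences.
eps = 1e-6
fd_grad = np.array([
    (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
    for e in np.eye(4)
])
assert np.allclose(grad, fd_grad, atol=1e-5)
```

Frameworks such as JAX or PyTorch obtain such derivatives automatically from the einsum expression; the paper's contribution concerns doing this symbolically and efficiently for higher-order derivatives directly in Einstein notation.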

AD Theory and Techniques
Hierarchical Approach

BibTeX
@INPROCEEDINGS{Laue2020ASa,
  author      = "S{\"{o}}ren Laue and Matthias Mitterreiter and Joachim Giesen",
  title       = "A Simple and Efficient Tensor Calculus",
  booktitle   = "The Thirty-Fourth {AAAI} Conference on Artificial Intelligence,
                 {AAAI} 2020, New York, NY, USA, February 7--12, 2020",
  publisher   = "{AAAI} Press",
  year        = "2020",
  pages       = "4527--4534",
  url         = "https://aaai.org/ojs/index.php/AAAI/article/view/5881",
  doi         = "10.1609/aaai.v34i04.5881",
  ad_theotech = "Hierarchical Approach",
  abstract    = "Computing derivatives of tensor expressions, also known as tensor
                 calculus, is a fundamental task in machine learning. A key concern is
                 the efficiency of evaluating the expressions and their derivatives
                 that hinges on the representation of these expressions. Recently, an
                 algorithm for computing higher order derivatives of tensor
                 expressions like Jacobians or Hessians has been introduced that is a
                 few orders of magnitude faster than previous state-of-the-art
                 approaches. Unfortunately, the approach is based on Ricci notation
                 and hence cannot be incorporated into automatic differentiation
                 frameworks like TensorFlow, PyTorch, autograd, or JAX that use the
                 simpler Einstein notation. This leaves two options, to either change
                 the underlying tensor representation in these frameworks or to
                 develop a new, provably correct algorithm based on Einstein notation.
                 Obviously, the first option is impractical. Hence, we pursue the
                 second option. Here, we show that using Ricci notation is not
                 necessary for an efficient tensor calculus and develop an equally
                 efficient method for the simpler Einstein notation. It turns out that
                 turning to Einstein notation enables further improvements that lead
                 to even better efficiency."
}


Contact:
autodiff.org