
Differentiable Programming Tensor Networks

- Article in a journal -

Author(s)
Hai-Jun Liao, Jin-Guo Liu, Lei Wang, Tao Xiang

Published in
Phys. Rev. X

Year
2019

Publisher
American Physical Society

Abstract
Differentiable programming is a fresh programming paradigm that composes parameterized algorithmic components and optimizes them using gradient search. The concept emerges from deep learning but is not limited to training neural networks. We present the theory and practice of programming tensor network algorithms in a fully differentiable way. By formulating the tensor network algorithm as a computation graph, one can compute higher-order derivatives of the program accurately and efficiently using automatic differentiation. We present essential techniques for differentiating through tensor network contraction algorithms, including numerically stable differentiation for tensor decompositions and efficient backpropagation through fixed-point iterations. As a demonstration, we compute the specific heat of the Ising model directly by taking the second-order derivative of the free energy obtained in the tensor renormalization group calculation. Next, we perform gradient-based variational optimization of infinite projected entangled pair states for the quantum antiferromagnetic Heisenberg model and obtain state-of-the-art variational energy and magnetization with moderate effort. Differentiable programming removes the laborious human effort of deriving and implementing analytical gradients for tensor network programs, which opens the door to more innovations in tensor network algorithms and applications.
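The sketches below illustrate the techniques the abstract names. They are minimal stand-ins under stated assumptions, not the paper's code; JAX is assumed as the automatic differentiation framework throughout.

First, the specific-heat demonstration: nesting automatic differentiation yields the second-order derivative of a free energy. Here the exactly solvable free energy per site of the 1D classical Ising chain stands in for the tensor renormalization group contraction the paper uses for the 2D model.

import jax
import jax.numpy as jnp

def free_energy(T, J=1.0):
    # Exact transfer-matrix free energy per site of the 1D classical Ising
    # chain (k_B = 1): f(T) = -T ln(2 cosh(J/T)).
    return -T * jnp.log(2.0 * jnp.cosh(J / T))

# Specific heat per site, C = -T d^2f/dT^2, obtained by nesting jax.grad.
d2f_dT2 = jax.grad(jax.grad(free_energy))
def specific_heat(T):
    return -T * d2f_dT2(T)

print(specific_heat(1.0))  # matches the closed form (J/T)^2 / cosh(J/T)^2

Second, stable differentiation of tensor decompositions: the SVD backward pass contains factors F_ij = 1/(s_j^2 - s_i^2), which diverge for (near-)degenerate singular values. One commonly used stabilization (an assumption here, not necessarily the paper's exact scheme) is a Lorentzian broadening:

def stable_inverse_gap(s, eps=1e-12):
    # s: vector of singular values. Returns F_ij ~ 1/(s_j^2 - s_i^2) where the
    # gap is large, but finite (and zero on the diagonal) where it is not.
    gap = s[None, :] ** 2 - s[:, None] ** 2
    return gap / (gap ** 2 + eps)

Third, backpropagation through fixed-point iterations. Rather than storing every forward step, a custom backward pass can solve the adjoint equation of the implicit function theorem by iteration; the names below (fixed_point and its helpers) are illustrative, not from the paper.

from functools import partial

@partial(jax.custom_vjp, nondiff_argnums=(0,))
def fixed_point(f, theta, x0):
    # Forward pass: plain iteration x <- f(theta, x), assumed to converge.
    x = x0
    for _ in range(100):
        x = f(theta, x)
    return x

def fixed_point_fwd(f, theta, x0):
    x_star = fixed_point(f, theta, x0)
    return x_star, (theta, x_star)

def fixed_point_bwd(f, res, v):
    theta, x_star = res
    # Solve u = v + u df/dx|_{x*} by iteration: the geometric-series form of
    # the implicit function theorem, so memory does not grow with the number
    # of forward iterations.
    _, vjp_x = jax.vjp(lambda x: f(theta, x), x_star)
    u = v
    for _ in range(100):
        u = v + vjp_x(u)[0]
    _, vjp_theta = jax.vjp(lambda t: f(t, x_star), theta)
    return vjp_theta(u)[0], jnp.zeros_like(x_star)  # x0 does not affect x*

fixed_point.defvjp(fixed_point_fwd, fixed_point_bwd)

# Hypothetical usage: differentiate the fixed point of x = tanh(theta + x).
f = lambda theta, x: jnp.tanh(theta + x)
print(jax.grad(lambda theta: fixed_point(f, theta, jnp.array(0.0)))(0.5))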

AD Theory and Techniques
Hierarchical Approach

BibTeX
@ARTICLE{Liao2019DPT,
  title       = "Differentiable Programming Tensor Networks",
  author      = "Liao, Hai-Jun and Liu, Jin-Guo and Wang, Lei and Xiang, Tao",
  journal     = "Phys. Rev. X",
  volume      = "9",
  issue       = "3",
  pages       = "031041",
  numpages    = "12",
  year        = "2019",
  month       = "Sep",
  publisher   = "American Physical Society",
  doi         = "10.1103/PhysRevX.9.031041",
  url         = "https://link.aps.org/doi/10.1103/PhysRevX.9.031041",
  abstract    = "Differentiable programming is a fresh programming paradigm that composes
    parameterized algorithmic components and optimizes them using gradient search. The concept
    emerges from deep learning but is not limited to training neural networks. We present the
    theory and practice of programming tensor network algorithms in a fully differentiable way.
    By formulating the tensor network algorithm as a computation graph, one can compute
    higher-order derivatives of the program accurately and efficiently using automatic
    differentiation. We present essential techniques for differentiating through tensor network
    contraction algorithms, including numerically stable differentiation for tensor
    decompositions and efficient backpropagation through fixed-point iterations. As a
    demonstration, we compute the specific heat of the Ising model directly by taking the
    second-order derivative of the free energy obtained in the tensor renormalization group
    calculation. Next, we perform gradient-based variational optimization of infinite projected
    entangled pair states for the quantum antiferromagnetic Heisenberg model and obtain
    state-of-the-art variational energy and magnetization with moderate effort. Differentiable
    programming removes the laborious human effort of deriving and implementing analytical
    gradients for tensor network programs, which opens the door to more innovations in tensor
    network algorithms and applications.",
  ad_theotech = "Hierarchical Approach"
}

