AD Tools for ALL


77 tools found

Alphabetical List of Tools

  • ad  (Python)
    Transparent, calculator-style first and second-order derivatives.

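Entries like `ad` expose derivatives through ordinary, calculator-style arithmetic on overloaded values. The underlying forward-mode idea can be sketched with a minimal dual-number class (a generic illustration, not the `ad` package's actual API):

```python
# Forward-mode AD with dual numbers: each value carries its derivative
# alongside it, and overloaded operators apply the chain rule per operation.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val = val   # function value
        self.dot = dot   # derivative w.r.t. the chosen input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

x = Dual(3.0, 1.0)      # seed dx/dx = 1
y = x * x + 2 * x       # f(x) = x^2 + 2x
print(y.val, y.dot)     # 15.0 8.0  (f(3) = 15, f'(3) = 8)
```
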
  • AD Model Builder  (C/C++)
    AD Model Builder (ADMB) was specifically designed for complex highly-parameterized nonlinear models. ADMB uses automatic differentiation to provide the function optimizer with exact derivatives.

  • AD4CL  (C/C++,OpenCL)
    Automatic Differentiation for GPU computing

  • ADC  (C/C++)
    The vivlabs ADC Automatic Differentiation Software for C/C++ delivers rapid integration of automatic differentiation capability into your new and existing applications on all operating system platforms. ADC automatically exploits the sparsity within your equation matrices, which leads to strong performance for small, large, and extremely large applications.

  • ADEL  (C/C++)
    ADEL is an open-source C++ template library for Automatic Differentiation in forward mode. Works with CUDA out of the box.

  • Adept  (C/C++)
    Adept is an operator-overloading implementation of first-order forward- and reverse-mode automatic differentiation. It is very fast thanks to its use of expression templates and a very efficient tape structure: in reverse mode it is typically only 2.5-4 times slower than the original undifferentiated algorithm. It is released under the Apache License, Version 2.0.

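Adept's tape-based reverse mode records each elementary operation during the forward sweep, then propagates adjoints backwards over the recording. A minimal tape in plain Python (illustrative only; the names here are made up and this is not Adept's actual interface):

```python
# Reverse-mode AD with a tape: the forward sweep records, for each result,
# the indices of its inputs and the local partial derivatives; the backward
# sweep accumulates adjoints by the chain rule in reverse order.

tape = []    # entries: (output_index, [(input_index, partial), ...])
values = []  # primal values, indexed by position

def var(v):
    values.append(v)
    tape.append((len(values) - 1, []))   # independent variables have no parents
    return len(values) - 1

def add(i, j):
    values.append(values[i] + values[j])
    k = len(values) - 1
    tape.append((k, [(i, 1.0), (j, 1.0)]))
    return k

def mul(i, j):
    values.append(values[i] * values[j])
    k = len(values) - 1
    tape.append((k, [(i, values[j]), (j, values[i])]))
    return k

def grad(out):
    adj = [0.0] * len(values)
    adj[out] = 1.0
    for k, parents in reversed(tape):
        for i, p in parents:
            adj[i] += p * adj[k]         # chain rule, accumulated in reverse
    return adj

x = var(3.0)
y = var(4.0)
z = add(mul(x, x), mul(x, y))            # z = x^2 + x*y
adj = grad(z)
print(values[z], adj[x], adj[y])         # 21.0 10.0 3.0
```

One backward sweep yields the gradient with respect to all inputs at once, which is why reverse mode dominates when there are many inputs and few outputs.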
  • ADF  (Fortran77,Fortran95)
    The vivlabs ADF Automatic Differentiation Software for FORTRAN delivers rapid integration of automatic differentiation capability into your new and existing applications on all operating system platforms. ADF automatically exploits the sparsity within your equation matrices, which leads to strong performance for small, large, and extremely large applications.

  • ADG  (Fortran 77/90,Fortran77,Fortran95)
    The Adjoint Code Generator (ADG) is a source-to-source transformation tool used to generate the adjoint model. Designed with the Least Program Behavior Decomposition Method, ADG supports global data-dependence analysis and code optimization at the statement level.

  • ADIC  (C/C++)
    ADIC is a tool for the automatic differentiation (AD) of programs written in ANSI C. First derivatives are computed using forward mode with statement level preaccumulation. Second derivatives are computed using one of several forward mode strategies.

  • ADIFOR  (Fortran77)
    Given a Fortran 77 source code and a user's specification of dependent and independent variables, ADIFOR will generate an augmented derivative code that computes the partial derivatives of all of the specified dependent variables with respect to all of the specified independent variables in addition to the original result.

  • ADiGator  (MATLAB)
    Given a user function program together with information pertaining to the inputs of the program, ADiGator performs source transformation via the overloaded CADA class to generate any order derivative code.

  • ADiJaC  (Java)
    ADiJaC uses source code transformation to generate derivative codes in both the forward and the reverse modes of automatic differentiation.

  • ADiMat  (MATLAB)
    ADiMat uses a hybrid approach of source transformation and object-oriented programming techniques to compute first- and second-order derivatives of MATLAB programs.

  • ADMAT / ADMIT  (MATLAB)
    ADMAT 2.0 enables you to differentiate MATLAB functions, and allows you to compute gradients, Jacobian matrices and Hessian matrices of nonlinear maps defined via M-files. Both forward and reverse modes are included.

  • ADNumber  (C/C++)
    Automatic differentiation of arbitrary order to machine precision. Uses templates and expression trees.

  • ADOL-C  (C/C++,Julia,Python)
    The package ADOL-C facilitates the evaluation of first and higher derivatives of vector functions that are defined by computer programs written in C or C++. The resulting derivative evaluation routines may be called from C/C++, Fortran, or any other language that can be linked with C. ADOL-C is distributed by the COIN-OR Foundation with the Common Public License CPL or the GNU General Public License GPL.

  • ADOL-F  (Fortran95)
    The tool ADOL-F was an early attempt to use the overloading capabilities newly introduced to Fortran to create an execution trace. The idea was to replicate the format of the ADOL-C execution trace (aka the "tape") so that one could reuse the ADOL-C drivers to do the derivative computation. Because of the lack of a "destructor" for the active type that enables the execution trace, there was no means to curtail the growth of active locations (see ADOL-C). The tool is no longer maintained and listed here just to keep the record complete.

  • APMonitor  (Interpreted)
    The APMonitor Modeling Language is an interpreted language for algebraic and differential equations. As an interpreted language, it has the ability to provide analytic derivatives to almost any programming language.

  • April-ANN  (Lua)
    The April-ANN toolkit (A Pattern Recognizer In Lua with Artificial Neural Networks) incorporates ANN algorithms along with other pattern recognition methods such as hidden Markov models (HMMs). Automatic differentiation is currently available at an experimental stage for advanced machine learning research. Feel free to contact us if you want to collaborate on the development of the autodiff package.

  • AuDi  (C/C++,Python)
    AuDI is an open-source, header-only C++ library for AUtomated DIfferentiation implementing a truncated Taylor polynomial algebra (aka differential algebra). Its core is also exposed as a Python module called pyaudi.

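The truncated Taylor polynomial algebra that AuDi implements propagates all derivative coefficients up to a fixed order through each operation; multiplication becomes a truncated convolution of coefficient lists. A one-variable sketch (generic illustration of the technique, not pyaudi's API):

```python
# Truncated Taylor arithmetic in one variable: coefficient c[k] of a value u
# holds u^(k)(x0)/k!, so one evaluation yields derivatives of every order
# up to the truncation order at once.

ORDER = 4  # keep terms up to x^4

def tmul(a, b):
    # Cauchy product of Taylor coefficients, truncated at ORDER
    c = [0.0] * (ORDER + 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j <= ORDER:
                c[i + j] += ai * bj
    return c

def tadd(a, b):
    return [x + y for x, y in zip(a, b)]

# independent variable x expanded at x0 = 2: value 2, first coefficient 1
x = [2.0, 1.0, 0.0, 0.0, 0.0]
f = tadd(tmul(x, x), x)      # f(x) = x^2 + x
# f(2) = 6, f'(2) = 5, f''(2)/2! = 1, higher terms vanish
print(f)                     # [6.0, 5.0, 1.0, 0.0, 0.0]
```
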
  • AUTODIF  (C/C++)
    A C++ library for automatic differentiation used as the building block for AD Model Builder

  • autodiff  (C/C++)
    autodiff is a C++17 library that uses modern and advanced programming techniques to enable automatic computation of derivatives in an efficient and extremely easy way.

  • AutoDiff .NET  (.NET)
    A simple .NET library for evaluating the value/gradient of a function using reverse-mode automatic differentiation.

  • AutoDiff_Library  (C/C++)
    This standalone AD library builds the computational graph and performs reverse gradient as well as reverse Hessian and Hessian-vector product algorithms on the graph. It is currently used in the parallel implementation of the Structured Modelling Language (http://www.maths.ed.ac.uk/ERGO/sml).

  • AUTO_DERIV  (Fortran77,Fortran95)
    AUTO_DERIV is a Fortran 90 module which can be used to evaluate the first and second derivatives of any continuous function with any number of independent variables. The function can be implicitly encoded in Fortran 77/90; only slight modifications in user code are required.

  • CasADi  (C/C++,MATLAB,Python)
    CasADi is a symbolic framework for numeric optimization implementing automatic differentiation in forward and reverse modes on sparse matrix-valued computational graphs.

  • Clad  (C/C++)
    Clad is an autodiff tool for C/C++ implemented as a plugin to the Clang compiler

  • cmpad  (Language independent)
    Compare Algorithmic Differentiation Tools

  • CoDiPack  (C/C++)
    CoDiPack is a C++-library that enables the computation of gradients in computer programs using Algorithmic Differentiation. It is based on the Operator Overloading approach and uses static polymorphism and expression templates, resulting in an extremely fast evaluation of adjoints or forward derivatives. It is specifically designed with HPC applications in mind.

  • COJAC  (Java)
    COJAC uses bytecode instrumentation to automatically enrich floats/doubles at runtime; the prototype offers both forward and reverse mode AD. The idea is presented in this short video: https://youtu.be/eAy71M34U_I?list=PLHLKWUtT0B7kNos1e48vKhFlGAXR1AAkF

  • ColPack  (C/C++)
    ColPack is a package consisting of implementations of various graph coloring and related algorithms for compression-based computation of sparse Jacobian and Hessian matrices using an Automatic Differentiation tool. ColPack is currently interfaced with ADOL-C. The coloring capabilities can be used for purposes other than derivative matrix computation.

  • COSY INFINITY  (Fortran77,Fortran95,C/C++)
    COSY is an open platform to support automatic differentiation, in particular to high order and in many variables. It also supports validated computation of Taylor models. The tools can be used as objects in F95 and C++ and through direct calls in F77 and C, as well as in the COSY scripting language which supports dynamic typing.

  • CppAD  (C/C++)
    CppAD uses operator overloading to compute derivatives of algorithms defined in C++. It is distributed by the COIN-OR Foundation with the Common Public License CPL or the GNU General Public License GPL. Installation procedures are provided for both Unix and Windows operating systems. The CppAD subversion repository can be used to view the source code. Extensive user and developer documentation is included.

  • CppADCodeGen  (C/C++)
    CppADCodeGen aims to extend the CppAD library in order to perform hybrid automatic differentiation, that is, to use operator overloading and generate/compile source code. Provides easy to use drivers for the generation and use of dynamic libraries under Linux. It also allows JIT compilation through Clang/LLVM. It is distributed under the Eclipse Public License 1.0 or the GNU General Public License 3 GPL.

  • CTaylor  (C/C++)
    High-performance library for computing with truncated Taylor series. It can use multiple independent variables and stores only potentially nonzero derivatives. The order of derivatives increases with nonlinear operations until a maximum (parameter) is reached. Based on Google's libtaylor and makes heavy use of boost::mpl.

  • dco/c++  (C/C++)
    dco/c++ implements first- and higher-order tangent and adjoint Algorithmic Differentiation (AD) by operator overloading in C++. It combines a cache-optimized internal representation generated with the help of C++ expression templates with an intuitive application programmer interface (API). dco/c++ has been applied successfully to a growing number of numerical simulations in the context of computational science, engineering and finance, for example, large-scale parameter calibration and shape optimization.

  • dco/map  (C/C++)
    dco/map is a C++11 tape-free operator overloading AD tool designed specifically to handle accelerators (GPUs, Xeon Phi, etc.). It uses template metaprogramming techniques to generate adjoint code at compile time; we call this meta adjoint programming.

  • DFT  (Fortran 77/90,Fortran77,Fortran95)
    DFT is a source-to-source transformation tool for generating the tangent linear model; it supports global data-dependence analysis and code optimization at the statement level.

  • DiffSharp  (.NET,F#,C#)
    DiffSharp is an automatic differentiation (AD) library implemented in the F# language. It supports C# and the other common language infrastructure languages. The library is under active development by Atılım Güneş Baydin and Barak A. Pearlmutter mainly for research applications in machine learning, as part of their work at the Brain and Computation Lab, Hamilton Institute, National University of Ireland Maynooth. Please visit the project website for detailed documentation and usage examples.

  • Enzyme  (C/C++,Fortran 77/90,Fortran2003,Fortran2008,Fortran77,Fortran95,Julia,LLVM,Language independent)
    Enzyme is a plugin that performs automatic differentiation (AD) of statically analyzable LLVM. By operating at the LLVM level, Enzyme is able to perform AD across a variety of languages and to apply optimization prior to AD.

  • FAD  (C/C++)
    An implementation of automatic differentiation for programs written in C++ using operator overloading and expression templates.

  • FADBAD/TADIFF  (C/C++)
    FADBAD is a C++ library implementing the forward and reverse mode of automatic differentiation by operator overloading for C++ programs. TADIFF is a C++ program package for performing Taylor expansions on functions implemented as C++ programs.

  • FastAD  (C/C++)
    FastAD is a C++ template library of automatic differentiation supporting both forward and reverse mode to compute gradients and Hessians. It utilizes the latest features in C++17 and expression templates for efficient computation.

  • FFADLib  (C/C++)
    FFADLib implements overloaded C++ arithmetic operators and elementary functions that employ fast automatic differentiation algorithms. Such algorithms use precomputed addresses of the derivatives in the data structure.

  • finmath-lib automatic differentiation extensions  (Java)
    Implementation of stochastic automatic differentiation (AD / AAD) for Monte Carlo simulations.

  • FortranCalculus Compiler  (FortranCalculus)
    FC-Compiler™ is a (free) Calculus-level compiler that simplifies tweaking parameters in one's math model. The FortranCalculus (FC) language is for math modeling, simulation, and optimization. FC is based on automatic differentiation, which simplifies computer code to an absolute minimum, i.e., a mathematical model, constraints, and the objective (function) definition. Minimizing the amount of code allows the user to concentrate on the science or engineering problem at hand rather than on the (numerical) process requirements needed to achieve an optimum solution. FC-Compiler™ includes many (50+) example problems with output for viewing and for getting ideas on solving your own problems. Industry problems and solutions from the past fifty-plus years have been collected in a textbook to show the power of Calculus-level problem solving. The textbook is available at goal-driven.net/textbooks.

  • ForwardDiff.jl  (Julia)
    The ForwardDiff package provides an implementation of forward-mode automatic differentiation (FAD) in Julia.

  • FunG  (C/C++)
    A library for simple and efficient generation of nonlinear functions and its first-, second-, and third-order derivatives. The focus is on invariant-based models, such as in nonlinear elasticity, and functions that pass the assembly process in FE-computations. Supports scalars, vectors, matrices and more general types satisfying a (relaxed) vector space structure.

  • GRESS  (Fortran77)
    GRESS (Gradient-Enhanced Software System) reads an existing Fortran code as input and produces an enhanced Fortran code as output. The enhanced code has additional new lines of coding for calculating derivative information analytically but using the rules of calculus. The enhanced model reproduces the reference model calculations and has the additional capability to compute derivatives and sensitivities specified by the user. The user also specifies whether the direct or adjoint method is to be used in computing sensitivities.

  • Hackage :: ad  (Haskell)
    Forward-, reverse- and mixed-mode automatic differentiation combinators with a common API. Type-level "branding" is used to both prevent the end user from confusing infinitesimals and to limit unsafe access to the implementation details of each mode.

  • Hackage :: backprop: Heterogeneous automatic differentation  (Haskell)
    Write your functions to compute your result, and the library will automatically generate functions to compute your gradient. Implements heterogeneous reverse-mode automatic differentiation, commonly known as "backpropagation". See https://backprop.jle.im for official introduction and documentation.

  • Hackage :: fad  (Haskell)
    Forward Automatic Differentiation via overloading to perform nonstandard interpretation that replaces original numeric type with corresponding generalized dual number type. Existential type "branding" is used to prevent perturbation confusion. **Note: In general we recommend using the ad package maintained by Edward Kmett instead of this package.**

  • Hackage :: inf-backprop  (Haskell)
    Automatic differentiation and backpropagation. It does not use a gradient tape. Instead, the differentiation operator is defined directly as a map between differentiable function objects. Such functions are combined in arrow style using (>>>), (***), first, etc. The original purpose of the package is to provide an automatic backpropagation differentiation component for a functional, type-dependent library for deep machine learning. See the tutorial for details.

  • HSL_AD02  (Fortran95)
    Provides automatic differentiation facilities for variables specified by Fortran code. Each active variable must be declared to be of a derived type defined by the package instead of real. The backward method is available for first and second derivatives. The forward method is available for derivatives of any order.

  • INTLAB  (MATLAB)
    INTLAB is a MATLAB toolbox for self-validating algorithms.

  • JAX  (Python)
    JAX is Autograd and XLA, brought together for high-performance machine learning research.

  • Kotlin𝛁  (Kotlin)
    Kotlin𝛁 is a framework for type-safe automatic differentiation in the Kotlin language. It allows users to express differentiable programs with higher-dimensional data structures and operators. We attempt to restrict syntactically valid constructions to those which are algebraically valid and can be checked at compile-time. By enforcing these constraints in the type system, it eliminates certain classes of runtime errors that may occur during the execution of a differentiable program. Due to type-inference in the language, most types may be safely omitted by the end user. Kotlin𝛁 strives to be expressive, safe, and notationally similar to mathematics.

  • NAGWare Fortran 95   (Fortran77,Fortran95)
    The NAGWare Fortran 95 Compiler is being extended to provide AD functionality. The first prototype will be distributed to beta testers by November 2002.

  • OpenAD  (C/C++,Fortran77,Fortran95)
    OpenAD is a source transformation tool that provides a language independent framework for the development and use of AD algorithms. It interfaces with language specific front-ends via an XML representation of the numerical core. Currently, Open64 is the front-end for FORTRAN and EDG/Sage3 the front-end for C/C++.

  • PCOMP  (Fortran77)
    PCOMP implements the forward and reverse mode for functions written in a FORTRAN-like modeling language, a subset of FORTRAN with a few extensions. First- and second-order derivatives are supported.

  • pyadolc  (Python)
    Python Wrapper of ADOL-C

  • pycppad  (Interpreted,Python)
    A boost::python interface to the C++ Algorithmic Differentiation package CppAD. The pycppad package is distributed under the BSD license.

  • QuantAD  (.NET,C#,C/C++)
    QuantAD® is an Automatic Differentiation tool targeted at Quantitative Finance and industries with similar requirements. It offers a robust and efficient alternative to finite difference (“bumping”) for computing sensitivities. With minor changes to the existing program in C++ or C#, the user is able to AD-enable the whole code base and automatically compute a large number of sensitivities with dramatic performance speedups compared to the traditional bumping approach. QuantAD has been designed from the ground up to cope with large code bases found in Quantitative libraries using numerical methods such as Monte-Carlo, Finite Difference, and Lattice-Based Schemes.

  • R/ADR  (R)
    R/ADR uses source transformation to implement AD for the R language. It uses the transformation server at r-adr.de to perform the differentiation of R functions. The R package called adr provides the required runtime environment and a set of utility functions. Conceptually, R/ADR is very similar to ADiMat.

  • Rapsodia  (C/C++,Fortran95)
    Rapsodia is a Python-based code generator that creates C++ or Fortran libraries to efficiently compute higher-order derivatives via operator overloading.

  • Sacado  (C/C++)
    The Sacado package provides automatic differentiation tools for C++ applications and is part of the larger Trilinos framework. It provides both forward and reverse modes, and leverages expression templates in the forward mode and a simplified tape data structure in the reverse mode for improved efficiency.

  • Stan Math Library  (C/C++)
    Forward- and reverse-mode implementations for probability, linear algebra, and ODE applications.

  • TAF  (Fortran 77/90,Fortran2003,Fortran2008,Fortran2018,Fortran77,Fortran95)
    Transformation of Algorithms in Fortran (TAF) is a source-to-source AD-tool for Fortran-95 programs. TAF supports forward and reverse mode of AD and Automatic Sparsity Detection (ASD) for detection of the sparsity structure of Jacobians.

  • TAMC  (Fortran77)
    TAMC is a source-to-source AD-tool for FORTRAN-77 programs. The generated code propagates derivatives in forward (tangent linear) or reverse (adjoint) mode. TAMC is very flexible thanks to many options and user directives.

  • TAPENADE  (C/C++,Fortran77,Fortran95)
    TAPENADE is a source-to-source AD tool. Given a FORTRAN77, FORTRAN95, or C source program, it generates its derivative in forward (tangent) or reverse (adjoint) mode. TAPENADE is the successor of ODYSSEE. TAPENADE is directly accessible through a web servlet, or can be downloaded locally.

  • TaylUR  (Fortran95)
    TaylUR is a Fortran 95 module to automatically compute the numerical values of a complex-valued function's derivatives w.r.t. several variables up to an arbitrary order in each variable, but excluding mixed derivatives.

  • The Taylor Center  (Delphi,Language independent)
    ODE Solver for Initial Value Problems (IVPs) given in the form of a system of 1st order explicit ODEs. The integration is based on Automatic Differentiation of the right hand sides entered in a conventional mathematical notation. The package is an All-In-One advanced GUI application with near real time animation of the solution in 2D or in 3D stereo with a conventional monitor and Red/Blue glasses.

  • TOMLAB /MAD  (MATLAB)
    The TOMLAB /MAD package introduces automatic differentiation for MATLAB users via operator overloading. Combined with the TOMLAB Base Module, it provides complete integration for advanced optimization applications, with more than 100 algorithms available. MAD can also be used as a stand-alone package for the MATLAB user.

  • TOMLAB /TomSym  (MATLAB)
    TomSym uses MATLAB objects and operator overloading to capture MATLAB procedures, and then generates source code for derivatives of any order. TomSym also integrates with the TOMLAB optimization environment to provide an easy-to-use interface for a broad range of optimization problems.

  • Treeverse / Revolve  (C/C++,Fortran77,Fortran95)
    Revolve implements an efficient checkpointing algorithm for the exact computation of a gradient of a functional consisting of a (pseudo) time-stepping procedure.

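The checkpointing idea Revolve implements can be illustrated with a uniform-interval variant: store the state only at a few time steps and recompute the discarded steps from the nearest checkpoint during the reverse sweep. (Revolve itself places checkpoints binomially to minimize recomputation for a fixed memory budget; this sketch uses made-up names and a fixed interval.)

```python
# Checkpointing for a reverse sweep over a time-stepping loop: trade
# recomputation for memory by storing every K-th state instead of all N.

def step(state):                 # one (pseudo) time step of the forward model
    return 1.1 * state + 1.0

N, K = 10, 3                     # 10 steps, checkpoint every 3rd state
checkpoints = {}
state = 1.0
for t in range(N):
    if t % K == 0:
        checkpoints[t] = state   # store only a few snapshots
    state = step(state)

def state_at(t):
    # restart from the last stored checkpoint and recompute forward
    t0 = (t // K) * K
    s = checkpoints[t0]
    for _ in range(t - t0):
        s = step(s)
    return s

# the reverse sweep visits states in backward order via recomputation
recomputed = [state_at(t) for t in reversed(range(N))]

# sanity check against a fully stored trajectory
full, s = [], 1.0
for t in range(N):
    full.append(s)
    s = step(s)
assert recomputed == list(reversed(full))
print("states stored:", len(checkpoints))   # states stored: 4
```

Here memory drops from 10 stored states to 4 at the cost of recomputing at most K-1 steps per visit; Revolve's binomial schedule makes this trade-off provably optimal.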
  • XAD  (C/C++)
    XAD is a comprehensive open-source C++ library for automatic differentiation. It targets production-quality code at any scale, striving for both ease of use and high performance.

  • YAO  (C/C++)
    YAO is dedicated to the programming of numerical models and data assimilation. It is based on a module-graph methodology in which each module represents a function. YAO facilitates and generates the coding of the tangent linear and the adjoint of the model.

  

Contact:
autodiff.org