Show tools for a specific language:
- .NET
- C#
- C/C++
- Delphi
- F#
- Fortran 77/90
- Fortran2003
- Fortran2008
- Fortran2018
- Fortran77
- Fortran95
- FortranCalculus
- Haskell
- Interpreted
- Java
- Julia
- Kotlin
- LLVM
- Language independent
- Lua
- MATLAB
- OpenCL
- Python
- R
Alphabetical List of Tools
- Hackage :: ad (Haskell)
Forward-, reverse- and mixed-mode automatic differentiation combinators with a common API. Type-level "branding" is used both to prevent the end user from confusing infinitesimals and to limit unsafe access to the implementation details of each mode.
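A minimal sketch of the forward-mode technique these combinators implement, using a hand-rolled dual number via `Num` overloading. The `Dual` type and `diff'` function below are illustrative only, not the ad package's actual types or API.

```haskell
-- Forward-mode AD via operator overloading: a dual number carries a
-- value together with its derivative, and arithmetic propagates both.
data Dual = Dual Double Double  -- (value, derivative)

instance Num Dual where
  Dual x x' + Dual y y' = Dual (x + y) (x' + y')
  Dual x x' - Dual y y' = Dual (x - y) (x' - y')
  Dual x x' * Dual y y' = Dual (x * y) (x' * y + x * y')  -- product rule
  abs (Dual x x')       = Dual (abs x) (x' * signum x)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = Dual (fromInteger n) 0  -- constants have derivative 0

-- Differentiate f at a point by seeding the derivative slot with 1.
diff' :: (Dual -> Dual) -> Double -> Double
diff' f x = let Dual _ d = f (Dual x 1) in d

main :: IO ()
main = print (diff' (\x -> x * x + 3 * x) 2)  -- 2*2 + 3 = 7.0
```

The ad package generalizes this idea behind a common API, so the same user code can be run in forward, reverse, or mixed mode.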
- Hackage :: backprop: Heterogeneous automatic differentiation (Haskell)
Write your functions to compute your result, and the library will automatically generate functions to compute your gradient. Implements heterogeneous reverse-mode automatic differentiation, commonly known as "backpropagation". See https://backprop.jle.im for the official introduction and documentation.
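A minimal sketch of the reverse-mode ("backpropagation") technique: each operation returns its value together with a pullback that maps an output sensitivity back to input sensitivities. The types and names below are a toy illustration, not the backprop library's actual API.

```haskell
-- A differentiable step: value plus a vector-Jacobian product (pullback).
type BP a b = a -> (b, b -> a)

-- Multiplication of two inputs: d(x*y)/dx = y, d(x*y)/dy = x.
mul :: BP (Double, Double) Double
mul (x, y) = (x * y, \g -> (g * y, g * x))

-- Adding a constant: the pullback is the identity.
addC :: Double -> BP Double Double
addC c x = (x + c, \g -> g)

-- Gradient of f(x, y) = x*y + 1 at a point: run forward, then pull a
-- sensitivity of 1 back through the recorded pullbacks (the chain rule).
gradXY :: (Double, Double) -> (Double, Double)
gradXY p =
  let (v1, back1) = mul p
      (_,  back2) = addC 1 v1
  in back1 (back2 1)

main :: IO ()
main = print (gradXY (2, 3))  -- (3.0, 2.0)
```

The library automates exactly this bookkeeping: you write the forward computation, and the gradient function is derived for you, including for heterogeneous (mixed-type) inputs.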
- Hackage :: fad (Haskell)
Forward Automatic Differentiation via overloading to perform a nonstandard interpretation that replaces the original numeric type with a corresponding generalized dual-number type. Existential type "branding" is used to prevent perturbation confusion. **Note: In general we recommend using the ad package maintained by Edward Kmett instead of this package.**
- Hackage :: inf-backprop (Haskell)
Automatic differentiation and backpropagation. We do not use a gradient tape; instead, the differentiation operator is defined directly as a map between differentiable function objects. Such functions are combined in arrow style using (>>>), (***), first, etc. The original purpose of the package is to provide an automatic backpropagation differentiation component for a functional, type-dependent deep-learning library. See the tutorial for details.
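A minimal sketch of the "differentiable functions as composable objects" idea: a function object carries its value map and its derivative map, and composition applies the chain rule. The `D` type and the `>>>.`-style composition operator are hypothetical names for illustration, not inf-backprop's actual definitions.

```haskell
-- A scalar differentiable function: the map itself plus its derivative.
data D = D { eval :: Double -> Double, deriv :: Double -> Double }

-- Arrow-style left-to-right composition via the chain rule:
-- (g . f)'(x) = g'(f x) * f'(x)
(>>>.) :: D -> D -> D
D f f' >>>. D g g' = D (g . f) (\x -> g' (f x) * f' x)

square, shift3 :: D
square = D (\x -> x * x) (\x -> 2 * x)  -- x^2, derivative 2x
shift3 = D (+ 3) (const 1)              -- x + 3, derivative 1

main :: IO ()
main = do
  print (eval  (square >>>. shift3) 5)  -- 5^2 + 3 = 28.0
  print (deriv (square >>>. shift3) 5)  -- 1 * (2*5) = 10.0
```

Because differentiation is defined directly on these function objects, no runtime tape is recorded: the derivative of a pipeline is itself just another composed function.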
Automatic differentiation and backpropagation. We do not attract gradient tape. Instead, the differentiation operator is defined directly as a map between differentiable function objects. Such functions are to be combined in arrow style using (>>>), (***), first, etc. The original purpose of the package is an automatic backpropagation differentiation component for a functional type-dependent library for deep machine learning. See tutorial details.