Alphabetical List of Tools
- DiffSharp (.NET,F#,C#)
DiffSharp is an automatic differentiation (AD) library implemented in F#. It supports C# and the other Common Language Infrastructure (CLI) languages. The library is under active development by Atılım Güneş Baydin and Barak A. Pearlmutter, mainly for research applications in machine learning, as part of their work at the Brain and Computation Lab, Hamilton Institute, National University of Ireland Maynooth. Please visit the project website for detailed documentation and usage examples.
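DiffSharp's own API is exposed through F# (and usable from the other CLI languages) and is not shown here. As a language-neutral illustration of what an AD library computes, the following Python sketch implements forward-mode AD with dual numbers; the names `Dual`, `sin`, and `diff` are hypothetical and are not part of DiffSharp.

```python
import math

class Dual:
    """Pairs a value with its derivative and propagates both through arithmetic."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # chain rule: d/dx sin(u) = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def diff(f, x):
    """Derivative of f at x, exact up to floating point (no step-size error)."""
    return f(Dual(x, 1.0)).dot

print(diff(lambda x: x * sin(x), 2.0))  # sin(2) + 2*cos(2) ≈ 0.0770
```

Unlike a finite-difference estimate, the derivative falls out of the same arithmetic that computes the value, which is what lets AD libraries report machine-precision gradients.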
- MatLogica AADC (C#,C/C++,Python)
MatLogica AADC combines code transformation and operator overloading to compute the gradients of mathematical models efficiently, generating optimized machine code at runtime. The result is faster evaluation of the model and of its first- and higher-order derivatives. The approach is particularly useful for models with many parameters that require frequent gradient updates during training.
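As a toy illustration of the record-then-compile idea, the Python sketch below uses operator overloading to trace a model once and then compiles the trace into a reusable kernel returning the value and its derivative. AADC itself emits optimized native machine code; every name here (`Var`, `record_and_compile`, `kernel`) is hypothetical and not part of its API.

```python
class Var:
    """Symbolic scalar: overloaded arithmetic records value and derivative
    expressions as source text instead of computing numbers directly."""
    def __init__(self, expr, dexpr):
        self.expr, self.dexpr = expr, dexpr

    def _lift(self, o):
        return o if isinstance(o, Var) else Var(repr(o), "0.0")

    def __add__(self, o):
        o = self._lift(o)
        return Var(f"({self.expr}+{o.expr})", f"({self.dexpr}+{o.dexpr})")
    __radd__ = __add__

    def __mul__(self, o):
        o = self._lift(o)
        return Var(f"({self.expr}*{o.expr})",
                   f"({self.dexpr}*{o.expr}+{self.expr}*{o.dexpr})")
    __rmul__ = __mul__

def record_and_compile(f):
    """Trace f once through operator overloading, then compile a kernel
    that returns (value, derivative) for any numeric input."""
    y = f(Var("x", "1.0"))               # seed dx/dx = 1
    src = f"lambda x: ({y.expr}, {y.dexpr})"
    return eval(compile(src, "<generated>", "eval"))

kernel = record_and_compile(lambda x: x * x + 3.0 * x)
print(kernel(2.0))   # (10.0, 7.0): value and slope of x**2 + 3x at x = 2
```

Compiling the trace pays off when the same model is re-evaluated many times with different inputs, which matches the frequent-gradient-update training scenario described above.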
- QuantAD (.NET,C#,C/C++)
QuantAD® is an automatic differentiation tool targeted at quantitative finance and at industries with similar requirements. It offers a robust and efficient alternative to finite differences (“bumping”) for computing sensitivities. With minor changes to an existing C++ or C# program, the user can AD-enable the whole code base and automatically compute a large number of sensitivities, with dramatic performance speedups over the traditional bumping approach. QuantAD has been designed from the ground up to cope with the large code bases found in quantitative libraries that use numerical methods such as Monte Carlo, finite differences, and lattice-based schemes.
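To make the contrast with bumping concrete, here is a minimal Python sketch, not QuantAD's C++/C# interface, that computes the same sensitivities two ways: finite-difference bumping, which costs one extra model evaluation per input, and a toy reverse-mode tape, which recovers all sensitivities from a single forward pass plus one backward sweep. All names (`Node`, `ad_gradient`, `price`) are hypothetical.

```python
def bump_gradient(f, xs, h=1e-6):
    """Finite-difference bumping: one extra full model run per input."""
    base = f(xs)
    return [(f(xs[:i] + [xs[i] + h] + xs[i+1:]) - base) / h
            for i in range(len(xs))]

class Node:
    """Reverse-mode AD node: records the operation that produced it."""
    tape = []
    def __init__(self, val, parents=()):
        self.val, self.parents, self.adj = val, parents, 0.0
        Node.tape.append(self)

    def __add__(self, o):
        return Node(self.val + o.val, [(self, 1.0), (o, 1.0)])

    def __mul__(self, o):
        return Node(self.val * o.val, [(self, o.val), (o, self.val)])

def ad_gradient(f, xs):
    """All sensitivities from one forward pass plus one backward sweep."""
    Node.tape = []
    inputs = [Node(x) for x in xs]
    out = f(inputs)
    out.adj = 1.0
    for n in reversed(Node.tape):        # reverse chronological order
        for parent, local in n.parents:
            parent.adj += n.adj * local  # chain rule, accumulated
    return [n.adj for n in inputs]

price = lambda v: v[0] * v[1] + v[2]     # stand-in for a pricing model
print(bump_gradient(price, [2.0, 3.0, 1.0]))  # approx [3, 2, 1], 4 model runs
print(ad_gradient(price, [2.0, 3.0, 1.0]))    # exact  [3, 2, 1], 1 model run
```

The tape is recorded in creation order, so sweeping it in reverse visits every node after all of its consumers; this is the property that lets reverse-mode AD deliver a large number of sensitivities for roughly the cost of a small constant number of model evaluations, instead of one bump per input.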

