AD Tools for .NET

Alphabetical List of Tools

  • AutoDiff .NET  (.NET)
    A simple .NET library for evaluating the value and gradient of a function using reverse-mode automatic differentiation; a minimal sketch of that technique appears after this list.

  • DiffSharp  (.NET,F#,C#)
    DiffSharp is an automatic differentiation (AD) library implemented in F#. It supports C# and the other Common Language Infrastructure (CLI) languages. The library is under active development by Atılım Güneş Baydin and Barak A. Pearlmutter, mainly for research applications in machine learning, as part of their work at the Brain and Computation Lab, Hamilton Institute, National University of Ireland Maynooth. Please visit the project website for detailed documentation and usage examples.

  • QuantAD  (.NET,C#,C/C++)
    QuantAD® is an automatic differentiation (AD) tool targeted at quantitative finance and industries with similar requirements. It offers a robust and efficient alternative to finite differencing (“bumping”) for computing sensitivities; a sketch of the contrast appears after this list. With minor changes to an existing C++ or C# program, the user can AD-enable the whole code base and automatically compute a large number of sensitivities, with dramatic performance speedups over the traditional bumping approach. QuantAD has been designed from the ground up to cope with the large code bases found in quantitative libraries that use numerical methods such as Monte Carlo, finite-difference, and lattice-based schemes.
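The entries above all rely on reverse-mode AD without showing what it looks like in practice. The C# sketch below illustrates the tape-based technique such libraries build on: each operation records a closure describing how derivatives flow back to its inputs, and a single reverse sweep then yields every partial derivative at once. The Node type and its interface are assumptions made for illustration, not the API of AutoDiff .NET or any other tool listed here.

    using System;
    using System.Collections.Generic;

    // Minimal reverse-mode AD: each arithmetic operation records a closure
    // on a global tape; sweeping the tape backwards accumulates adjoints
    // (partial derivatives of the output with respect to every node).
    sealed class Node
    {
        public double Value;      // primal value, computed on the forward pass
        public double Adjoint;    // d(output)/d(this node), filled in by the reverse sweep
        public Action Backward;   // pushes this node's adjoint to its inputs

        static readonly List<Node> Tape = new List<Node>();

        public Node(double value) { Value = value; }

        public static Node operator +(Node a, Node b)
        {
            var n = new Node(a.Value + b.Value);
            n.Backward = () => { a.Adjoint += n.Adjoint; b.Adjoint += n.Adjoint; };
            Tape.Add(n);
            return n;
        }

        public static Node operator *(Node a, Node b)
        {
            var n = new Node(a.Value * b.Value);
            n.Backward = () => { a.Adjoint += b.Value * n.Adjoint; b.Adjoint += a.Value * n.Adjoint; };
            Tape.Add(n);
            return n;
        }

        public static Node Sin(Node a)
        {
            var n = new Node(Math.Sin(a.Value));
            n.Backward = () => { a.Adjoint += Math.Cos(a.Value) * n.Adjoint; };
            Tape.Add(n);
            return n;
        }

        // Seed the output with adjoint 1 and replay the tape in reverse.
        public static void ReverseSweep(Node output)
        {
            output.Adjoint = 1.0;
            for (int i = Tape.Count - 1; i >= 0; i--)
                Tape[i].Backward();
        }
    }

    static class Demo
    {
        static void Main()
        {
            var x = new Node(2.0);
            var y = new Node(3.0);
            var f = Node.Sin(x * y) + x;   // f(x, y) = sin(x * y) + x

            Node.ReverseSweep(f);

            Console.WriteLine($"f     = {f.Value}");
            Console.WriteLine($"df/dx = {x.Adjoint}");  // y * cos(x * y) + 1
            Console.WriteLine($"df/dy = {y.Adjoint}");  // x * cos(x * y)
        }
    }

Note that the reverse sweep fills in both df/dx and df/dy from one forward evaluation; this single-sweep gradient is the property all three tools above exploit.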

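For contrast, here is what the “bumping” baseline mentioned in the QuantAD entry amounts to: re-pricing with each input perturbed in turn, so the cost grows linearly with the number of sensitivities, whereas reverse-mode AD delivers the full gradient in one augmented evaluation. The toy pricer and its inputs are invented for illustration; this is not QuantAD code.

    using System;

    static class BumpingDemo
    {
        // Stand-in "pricer": any scalar function of the model inputs.
        // (Invented for illustration; a real pricer would run a Monte Carlo
        // or lattice computation over thousands of inputs.)
        static double Price(double[] p) => Math.Exp(-p[0]) * p[1] * p[2];

        // Central-difference bumping: 2 pricer calls per input, i.e. 2N calls
        // for N sensitivities. Reverse-mode AD replaces this whole loop with
        // a single forward + reverse sweep.
        static double[] BumpSensitivities(double[] p, double h)
        {
            var grad = new double[p.Length];
            for (int i = 0; i < p.Length; i++)
            {
                double saved = p[i];
                p[i] = saved + h; double up = Price(p);
                p[i] = saved - h; double down = Price(p);
                p[i] = saved;
                grad[i] = (up - down) / (2.0 * h);
            }
            return grad;
        }

        static void Main()
        {
            var inputs = new[] { 0.05, 100.0, 0.5 };   // e.g. rate, spot, notional
            var sens = BumpSensitivities(inputs, 1e-6);
            Console.WriteLine(string.Join(", ", sens));
            // 3 inputs already cost 6 pricer calls; AD's one-sweep gradient is
            // what makes large books of sensitivities tractable.
        }
    }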

Contact: autodiff.org