AutoDiff alternatives and similar packages
Based on the "Mathematics" category.

AngouriMath
A new open-source, cross-platform symbolic algebra library for C# and F#. Can be used for both production and research purposes.
WPFMath
A collection of .NET libraries for rendering mathematical formulae using the LaTeX typesetting style, for the WPF and Avalonia XAML-based frameworks.
Vim.Math3d
A .NET Standard 2.0 library for simple and efficient 3D math that is a feature-rich replacement for System.Numerics. https://vimaec.github.io/Math3D
ALGLIB
ALGLIB is a cross-platform numerical analysis and data processing library. It supports several programming languages (C++, C#, Delphi) and several operating systems (Windows and POSIX, including Linux). [Proprietary] and [Free Edition]
README

Project Description
A library that provides moderately fast, accurate, and automatic differentiation (computation of derivatives/gradients) of mathematical functions.
AutoDiff provides a simple and intuitive API for computing function gradients/derivatives along with a fast algorithm for performing the computation. Such computations are mainly useful in iterative numerical optimization scenarios.
Code example
using System;
using AutoDiff;

class Program
{
    public static void Main(string[] args)
    {
        // define variables
        var x = new Variable();
        var y = new Variable();
        var z = new Variable();

        // define our function
        var func = (x + y) * TermBuilder.Exp(z + x * y);

        // prepare arrays needed for evaluation/differentiation
        Variable[] vars = { x, y, z };
        double[] values = { 1, 2, 3 };

        // evaluate func at (1, 2, 3)
        double value = func.Evaluate(vars, values);

        // calculate the gradient at (1, 2, 3)
        double[] gradient = func.Differentiate(vars, values);

        // print results
        Console.WriteLine("The value at (1, 2, 3) is " + value);
        Console.WriteLine("The gradient at (1, 2, 3) is ({0}, {1}, {2})", gradient[0], gradient[1], gradient[2]);
    }
}
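In iterative optimization, the same function is differentiated many times at different points. The library's documentation describes a compilation step for this case; the sketch below assumes a `Compile` method returning a compiled term whose `Differentiate` yields a (gradient, value) pair — verify the exact member names against the version you use.

```csharp
using System;
using AutoDiff;

class CompiledExample
{
    public static void Main()
    {
        var x = new Variable();
        var y = new Variable();

        // f(x, y) = (x - 1)^2 + (y - 2)^2, minimized at (1, 2)
        var func = TermBuilder.Power(x - 1, 2) + TermBuilder.Power(y - 2, 2);

        // compile once; repeated Evaluate/Differentiate calls then reuse
        // the prepared evaluation plan instead of re-walking the term tree
        var compiled = func.Compile(x, y);

        var point = new double[] { 0, 0 };
        for (int i = 0; i < 100; i++)
        {
            // Differentiate returns both the gradient and the function value
            var result = compiled.Differentiate(point);
            double[] gradient = result.Item1;

            // simple fixed-step gradient descent using the AD gradient
            point[0] -= 0.1 * gradient[0];
            point[1] -= 0.1 * gradient[1];
        }

        Console.WriteLine("Minimizer is approximately ({0:F3}, {1:F3})", point[0], point[1]);
    }
}
```

Compiling pays off precisely in the optimization scenarios the README mentions, where the per-iteration cost is dominated by gradient evaluation.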
Documentation
The [Documentation](docs/Readme.md) contains some basic tutorials, there is an article on CodeProject, and the source code includes additional examples alongside the library code itself.
Motivation
There are many open-source and commercial .NET libraries that include numerical optimization among their features (for example, Microsoft Solver Foundation, AlgLib, Extreme Optimization, CenterSpace NMath). Most of them require the user to supply code that evaluates both the function and its gradient. This library saves the work of deriving and coding the gradient by hand: once the developer defines a function, AutoDiff can automatically evaluate and differentiate it at any point. This allows easy development and prototyping of applications that require numerical optimization.
Features
* Moderate execution speed. We aim to compute a gradient in no more than 50 times the duration of evaluating the function with manually tuned code.
* Composition of functions using arithmetic operators, Exp, Log, Power and user-defined unary and binary functions.
* Function gradient evaluation at specified points.
* Function value evaluation at specified points.
* Gradients are computed with the reverse-mode AD algorithm in linear time, which is substantially faster than numerical gradient approximation for multivariate functions.
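As a sketch of how built-in and user-defined functions compose, the example below uses `TermBuilder.Power`, `TermBuilder.Log`, and a custom unary function. The `UnaryFunc.Factory` helper (taking the function and its derivative) follows the project's documentation; treat the exact signatures as assumptions to check against your version.

```csharp
using System;
using AutoDiff;

class CompositionExample
{
    public static void Main()
    {
        // user-defined unary function: supply the function and its derivative
        // (signature assumed from the AutoDiff documentation)
        var sin = UnaryFunc.Factory(
            v => Math.Sin(v),   // value
            v => Math.Cos(v));  // derivative

        var x = new Variable();
        var y = new Variable();

        // compose arithmetic operators with Power, Log and the custom sin
        var func = TermBuilder.Power(x, 2) * sin(y) + TermBuilder.Log(x + y);

        Variable[] vars = { x, y };
        double[] point = { 1, Math.PI / 2 };

        // value is 1 * sin(pi/2) + log(1 + pi/2)
        double value = func.Evaluate(vars, point);
        double[] grad = func.Differentiate(vars, point);

        Console.WriteLine("value = {0}, gradient = ({1}, {2})", value, grad[0], grad[1]);
    }
}
```

Because differentiation is reverse-mode, the whole gradient comes from a single `Differentiate` call regardless of how many variables the composed term contains.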
Using in research papers
If you like the library and it helps you publish a research paper, please cite the paper I originally wrote the library for: [geosemantic.bib](docs/Home_geosemantic.bib).
Used by
* Andreas Witsch, Hendrik Skubch, Stefan Niemczyk, Kurt Geihs. "Using incomplete satisfiability modulo theories to determine robotic tasks." Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference.
* Michael Kommenda, Michael Affenzeller, Gabriel Kronberger, Stephan M. Winkler. "Nonlinear Least Squares Optimization of Constants in Symbolic Regression." Revised Selected Papers of the 14th International Conference on Computer Aided Systems Theory, EUROCAST 2013, Volume 8111.
* Alex Shtof, Alexander Agathos, Yotam Gingold, Ariel Shamir, Daniel Cohen-Or. "Geosemantic Snapping for Sketch-Based Modeling." Eurographics 2013 proceedings (code repository).
* Michael Kommenda, Gabriel Kronberger, Stephan Winkler, Michael Affenzeller, Stefan Wagner. "Effects of constant optimization by nonlinear least squares minimization in symbolic regression." Proceedings of the fifteenth annual conference companion on Genetic and Evolutionary Computation.
* Hendrik Skubch. "Solving non-linear arithmetic constraints in soft real-time environments." Proceedings of the 27th Annual ACM Symposium on Applied Computing.
* AlicaEngine, a cooperative planning engine for robotics. You can see it in action in this video.
* HeuristicLab, a framework for heuristic and evolutionary algorithms developed by members of the Heuristic and Evolutionary Algorithms Laboratory (HEAL).