Description
You can use Infer.NET to solve many different kinds of machine learning problems, from standard problems like classification, recommendation or clustering through to customised solutions to domain-specific problems. Infer.NET has been used in a wide variety of domains including information retrieval, bioinformatics, epidemiology, vision, and many others.
Infer.NET is published as open source on GitHub under the MIT license and is also available as NuGet packages.
Infer.NET alternatives and similar packages
Based on the "Machine Learning and Data Science" category.

ML.NET
Cross-platform, open-source machine learning framework that makes machine learning accessible to .NET developers.
Accord.NET
Machine learning framework combined with audio and image processing libraries (computer vision, computer audition, signal processing and statistics). 
TensorFlow.NET
.NET Standard bindings for Google's TensorFlow for developing, training and deploying Machine Learning models in C# and F#. 
AForge.NET
Framework for developers and researchers in the fields of Computer Vision and Artificial Intelligence (image processing, neural networks, genetic algorithms, machine learning, robotics). 
F# Data
F# type providers for accessing XML, JSON, CSV and HTML files (based on sample documents) and for accessing WorldBank data 
Deedle
Data frame and (time) series library for exploratory data manipulation with C# and F# support 
Accord.NET Extensions
Advanced image processing and computer vision algorithms made as fluent extensions. 
numl
Designed to include the most popular supervised and unsupervised learning algorithms while minimizing the friction involved with creating the predictive models. 
Spreads
Series and Panels for Real-time and Exploratory Analysis of Data Streams. The Spreads library is optimized for performance and memory usage. It is several times faster than other open-source projects.
Catalyst
Cross-platform Natural Language Processing (NLP) library inspired by spaCy, with pre-trained models, out-of-the-box support for training word and document embeddings, and flexible entity recognition models. Part of the SciSharp Stack.
SciSharp STACK
A rich machine learning ecosystem for .NET created by porting the most popular Python libraries to C#.
README
KJIT
The goal of this project is to learn a kernel-based message operator which takes as input all incoming messages to a factor and produces a projected outgoing expectation propagation (EP) message. In ordinary EP, computing an outgoing message may involve solving a difficult integral to minimize the KL divergence between the tilted distribution and the approximate posterior. Such an operator allows one to bypass the computation of the integral by directly mapping all incoming messages into an outgoing message. The operator is learned online during EP. It is termed KJIT, for Kernel-based Just-In-Time learning for passing EP messages.
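As a rough, hypothetical illustration of the idea (not this project's actual operator, which uses kernels on distributions, Bayesian regression, and online updates), the sketch below regresses outgoing-message parameters directly from incoming-message parameters using random Fourier features and ridge regression. All names, dimensions, and the toy targets are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_features(X, W, b):
    """Random Fourier features approximating a Gaussian kernel.
    X: (n, d) inputs, W: (d, D) frequencies, b: (D,) phases."""
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Toy setup: each input x stacks the parameters of the incoming messages
# (e.g. means and log-variances of Gaussians); the target y holds the
# parameters of the oracle outgoing message, which in real EP would come
# from solving the expensive KL-minimization integral.
d, D, n = 4, 300, 500
W = rng.normal(size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)
X = rng.normal(size=(n, d))                            # stand-in message parameters
Y = np.column_stack([X.sum(axis=1), np.cos(X[:, 0])])  # stand-in oracle outputs

# Ridge regression in random-feature space: beta maps features to outputs.
Phi = random_features(X, W, b)                         # (n, D)
lam = 1e-3
beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(D), Phi.T @ Y)

def predict_outgoing(x):
    """Map incoming-message parameters directly to outgoing-message
    parameters, bypassing the integral."""
    return (random_features(x[None, :], W, b) @ beta)[0]

y_hat = predict_outgoing(X[0])
```

In the paper the regression is Bayesian, so its predictive uncertainty can trigger a fall-back to the exact oracle computation when the operator is unsure; that just-in-time aspect is omitted in this sketch.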
Full details are in our UAI 2015 paper. Supplementary material is here.
Wittawat Jitkrittum, Arthur Gretton, Nicolas Heess,
S. M. Ali Eslami, Balaji Lakshminarayanan, Dino Sejdinovic, and Zoltán Szabó.
"Kernel-Based Just-In-Time Learning for Passing Expectation Propagation Messages."
UAI, 2015.
This project extends
Nicolas Heess, Daniel Tarlow, and John Winn.
"Learning to Pass Expectation Propagation Messages."
NIPS, 2013.
http://media.nips.cc/nipsbooks/nipspapers/paper_files/nips26/1493.pdf.
and
S. M. Ali Eslami, Daniel Tarlow, Pushmeet Kohli, and John Winn.
"Just-In-Time Learning for Fast and Flexible Inference."
NIPS, 2014.
http://papers.nips.cc/paper/5595-just-in-time-learning-for-fast-and-flexible-inference.pdf
License
The KJIT software is under the MIT license.
The KJIT software relies on Infer.NET (freely available for non-commercial use), which is not included in our software. Even though the license of the KJIT software is permissive, Infer.NET's license is not. Please refer to its license for details.
Repository structure
The repository contains:

- Matlab code for experimenting in a batch learning setting. Experiments on new kernels, factors, random features, and message operators are all done in Matlab in the first stage. Once the methods are developed, they are reimplemented in C# so that they can operate in the Infer.NET framework. EP inference is implemented in C# using Infer.NET, not in Matlab. All Matlab code is in the `code` folder.
- C# code for message operators in the Infer.NET framework. The code for this part is in `code/KernelEP.NET`, which contains a C# project developed with MonoDevelop (a free cross-platform IDE) on Ubuntu 14.04. You should be able to use Visual Studio on Windows to open the project file if you prefer.
All the code is written in Matlab and C# and is expected to be cross-platform.
Include Infer.NET
The Matlab part of this project does not depend on the Infer.NET package. However, to use our KJIT message operator in the Infer.NET framework, you have to include the Infer.NET package by taking the following steps.
1. Download the Infer.NET package from its Microsoft Research page. Upon extracting the zip archive, you will see subfolders including `Bin` and `Source`, as well as the license. Carefully read the license.
2. Copy `Infer.Compiler.dll` and `Infer.Runtime.dll` from the `Bin` folder of the extracted archive into `code/KernelEP.NET/lib/Infer.NET/Bin/` of this repository. Without this step, the project will not compile when opened in MonoDevelop, due to the missing dependency.
3. Try to build the project. There should be no errors.
Useful submodules
In the development of the code for learning an EP message operator, some commonly used functions were reimplemented to better suit the needs of this project. These functions might be useful for other work. They include:

- Incomplete Cholesky factorization. This is implemented in Matlab in such a way that any kernel and any type of data (not necessarily points in a Euclidean space) can be used. The full kernel matrix is never preloaded; only one row of the kernel matrix is computed at a time, allowing a large kernel matrix to be factorized. In this project, the points are distributions and the kernel takes two distributions as input. See `IncompChol`.
- Dynamic matrix in Matlab. This is a matrix whose entries are given by a function `f: (I, J) -> M`, where `I, J` are index lists and `M` is the submatrix specified by `I, J`. The dynamic matrix is useful when the underlying matrix is too large to fit into memory but its entries can be computed on the fly when needed. In this project, this object is used to represent the data matrix when a large number of random features is used. Multiplication (by a regular matrix or another dynamic matrix) is implemented. See `DynamicMatrix` and `DefaultDynamicMatrix`.
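To make the incomplete Cholesky idea concrete, here is a minimal Python/NumPy sketch in the same spirit (the repository's version is in Matlab; `kernel` here is any callback returning entries of one row of the kernel matrix, so the underlying data can be distributions or anything else):

```python
import numpy as np

def incomplete_cholesky(n, kernel, tol=1e-6, max_rank=None):
    """Pivoted incomplete Cholesky of an n-by-n PSD kernel matrix K,
    without ever forming K: kernel(i, J) returns the entries K[i, J],
    and only one full row is evaluated per iteration. Returns R (t-by-n)
    with R.T @ R approximately K, plus the chosen pivot indices."""
    max_rank = n if max_rank is None else max_rank
    # Residual diagonal of K; drives pivot selection and stopping.
    d = np.array([kernel(i, np.array([i]))[0] for i in range(n)], dtype=float)
    R = np.zeros((max_rank, n))
    pivots = []
    for t in range(max_rank):
        j = int(np.argmax(d))
        if d[j] <= tol:                     # residual small enough: stop early
            return R[:t], pivots
        row = kernel(j, np.arange(n))       # the single row of K needed now
        R[t] = (row - R[:t, j] @ R[:t]) / np.sqrt(d[j])
        d -= R[t] ** 2                      # update the residual diagonal
        d[j] = 0.0                          # guard against round-off
        pivots.append(j)
    return R[:max_rank], pivots

# Demo on a small explicit low-rank PSD matrix standing in for a kernel.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
K = A @ A.T                                 # rank 5, so few pivots suffice
R, pivots = incomplete_cholesky(20, lambda i, J: K[i, J])
```

Because each iteration touches only one row of K, the memory cost is governed by the factor R, not by the full kernel matrix.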
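Similarly, a hypothetical Python sketch of the dynamic-matrix idea (the real `DynamicMatrix` and `DefaultDynamicMatrix` are Matlab classes; the class and parameter names below are invented for illustration):

```python
import numpy as np

class DynamicMatrix:
    """A matrix whose entries come from f(I, J) -> submatrix on demand,
    so the full matrix never has to be held in memory at once."""

    def __init__(self, f, shape, block=256):
        self.f = f          # f(I, J) returns the submatrix at rows I, cols J
        self.shape = shape
        self.block = block  # how many rows to materialize at a time

    def matmul(self, B):
        """Multiply by a dense matrix B, one block of rows at a time."""
        n, m = self.shape
        assert B.shape[0] == m
        out = np.empty((n, B.shape[1]))
        cols = np.arange(m)
        for start in range(0, n, self.block):
            rows = np.arange(start, min(start + self.block, n))
            out[rows] = self.f(rows, cols) @ B   # only `block` rows in memory
        return out

# Demo: back the dynamic matrix with a small explicit matrix.
rng = np.random.default_rng(0)
M = rng.normal(size=(30, 7))
dm = DynamicMatrix(lambda I, J: M[np.ix_(I, J)], shape=(30, 7), block=8)
B = rng.normal(size=(7, 3))
P = dm.matmul(B)
```

In practice `f` would evaluate random features of the data on the fly rather than index into a stored matrix, which is what makes the approach memory-efficient.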
Code usage
Please feel free to contact me (see wittawat.com) regarding code usage. For fun, a visualization of this repository is available here.
*Note that all license references and agreements mentioned in the Infer.NET README section above are relevant to that project's source code only.