Intuitive

Turing models are easy to read and write, so you can specify a model quickly and get straight to inference.

Universal

Turing supports models with stochastic control flow — models work the way you write them.

Adaptable

Turing is written fully in Julia, and can be modified to suit your needs.

A Quick Example

Turing’s modelling syntax allows you to specify a model quickly and easily. Straightforward models can be expressed in the same way as complex, hierarchical models with stochastic control flow.

Quick Start

using Turing

@model function gdemo(x, y)
    # Assumptions
    σ ~ InverseGamma(2, 3)
    μ ~ Normal(0, sqrt(σ))
    # Observations
    x ~ Normal(μ, sqrt(σ))
    y ~ Normal(μ, sqrt(σ))
end
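The Quick Start model is fully continuous, but Turing also handles models whose structure depends on sampled values. As a hedged sketch (an illustrative model, not one from the Turing documentation), branching on a discrete latent variable might look like this:

```julia
using Turing

# Illustrative model: which prior generates `m` depends on the sampled
# value of the discrete variable `z` — stochastic control flow.
@model function switching(y)
    z ~ Bernoulli(0.5)      # latent branch choice
    if z
        m ~ Normal(0, 1)    # one regime
    else
        m ~ Normal(10, 1)   # the other regime
    end
    y ~ Normal(m, 1)
end
```

Only the branch that is actually taken contributes to the joint density, so the model works the way it reads.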

Large Sampling Library

Turing provides Hamiltonian Monte Carlo (HMC) sampling for differentiable posterior distributions, particle MCMC sampling for complex posteriors involving discrete variables and stochastic control flow, and Gibbs sampling, which composes particle MCMC, HMC, and many other MCMC algorithms.

Samplers

Integrates With Other Machine Learning Packages

Turing supports Julia’s Flux package for automatic differentiation. Combine Turing and Flux to construct probabilistic variants of traditional machine learning models.
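One way to combine the two, sketched under the assumption of a small binary classifier (the network shape, prior, and data layout here are illustrative, not the tutorial's exact setup), is to place a prior over a Flux network's flattened parameters via `Flux.destructure`:

```julia
using Turing, Flux, LinearAlgebra

# A small Flux network; destructure flattens its parameters into a vector
# and returns a function that rebuilds the network from such a vector.
nn = Chain(Dense(2, 3, tanh), Dense(3, 1, sigmoid))
θ, reconstruct = Flux.destructure(nn)

@model function bayes_nn(xs, ys, nparams = length(θ))
    # Standard normal prior over all network weights (an assumption).
    w ~ MvNormal(zeros(nparams), I)
    net = reconstruct(w)
    for i in eachindex(ys)
        # Network output is the Bernoulli success probability.
        ys[i] ~ Bernoulli(net(xs[:, i])[1])
    end
end
```

Sampling from this model then yields a posterior over network weights rather than a single point estimate.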

Bayesian Neural Network Tutorial

Citing Turing

If you use Turing for your own research, please consider citing the following publication: Hong Ge, Kai Xu, and Zoubin Ghahramani. "Turing: Composable inference for probabilistic programming." AISTATS 2018.