Turing is a Julia library for (universal) probabilistic programming. Current features include:
- Universal probabilistic programming with an intuitive modelling interface
- Hamiltonian Monte Carlo (HMC) sampling for differentiable posterior distributions
- Particle MCMC sampling for complex posterior distributions involving discrete variables and stochastic control flows
- Compositional MCMC sampling that combines particle MCMC and HMC
To use Turing, you need to install Julia first and then install Turing.
You will need Julia 0.5 (0.4 also works, but 0.5 is recommended), which you can get from the official Julia website.
The website provides three options for users:
- A command-line version, available from the Julia downloads page
- Juno, a community-maintained IDE
- JuliaBox.com, a Jupyter notebook in the browser
For the command-line version, we recommend installing a build downloaded from Julia's official website, as Turing may not work correctly with Julia obtained from other sources (e.g. Turing does not work with Julia installed via apt-get, due to missing header files).
Juno also requires the command-line version to be installed. This IDE is recommended for heavy users who need features such as debugging and quick documentation lookup.
JuliaBox provides a pre-installed Jupyter notebook for Julia, so you can try Turing in a few seconds without installing Julia on your machine.
Turing is an officially registered Julia package, so the following should install a stable version of Turing:
Pkg.add("Turing")
If you want to use the latest version of Turing with some experimental features, you can try the following instead:
Pkg.update()
Pkg.clone("Turing")
Pkg.build("Turing")
Pkg.test("Turing")
If all tests pass, you're ready to start using Turing.
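As a quick smoke test of the installation, you can load the package and draw a few samples from a trivial model. The `smoke` model below is made up purely for illustration; it uses the same `@model`/`sample` API as the example in the next section.

```julia
using Turing

# A one-variable model: a single standard-normal latent, no observations.
@model smoke() = begin
  m ~ Normal(0, 1)
  return m
end

# Draw 100 samples with SMC; if this runs without error, Turing is working.
chain = sample(smoke(), SMC(100))
```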
A Turing probabilistic program is just a normal Julia program, wrapped in a @model macro, that uses some of the special macros illustrated below. Available inference methods include Importance Sampling (IS), Sequential Monte Carlo (SMC), Particle Gibbs (PG), Hamiltonian Monte Carlo (HMC), and compositional Gibbs sampling.
# Define a simple Normal model with unknown mean and variance.
@model gdemo(x) = begin
  s ~ InverseGamma(2, 3)
  m ~ Normal(0, sqrt(s))
  x[1] ~ Normal(m, sqrt(s))
  x[2] ~ Normal(m, sqrt(s))
  return s, m
end
Inference methods are functions which take the probabilistic program as one of their arguments.
# Run sampler, collect results
c1 = sample(gdemo([1.5, 2]), SMC(1000))
c2 = sample(gdemo([1.5, 2]), PG(10,1000))
c3 = sample(gdemo([1.5, 2]), HMC(1000, 0.1, 5))
c4 = sample(gdemo([1.5, 2]), Gibbs(1000, PG(10, 2, :m), HMC(2, 0.1, 5, :s)))
# Summarise results
describe(c3)
# Plot results
p = Turing.plot(c3)
Turing.draw(p, fmt=:pdf, filename="gdemo-plot.pdf")
The arguments for each sampler are:
- SMC: number of particles
- PG: number of particles, number of iterations
- HMC: number of samples, leapfrog step size, number of leapfrog steps
- Gibbs: number of samples, component sampler 1, component sampler 2, ... (each component sampler additionally takes the names of the variables it updates, e.g. :m and :s above)
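As a worked reading of the compositional call above (a sketch assuming Turing is installed and gdemo is defined as in the example):

```julia
using Turing

# 1000 Gibbs sweeps; within each sweep, a PG sampler with 10 particles and
# 2 iterations updates m, then an HMC sampler drawing 2 samples with step
# size 0.1 and 5 leapfrog steps updates s.
c = sample(gdemo([1.5, 2]), Gibbs(1000, PG(10, 2, :m), HMC(2, 0.1, 5, :s)))
```

Combining samplers this way lets discrete or awkward variables be handled by particle methods while differentiable ones use HMC.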
Turing is an open-source project, so if you have relevant skills and are interested in contributing, please do get in touch. You can contribute by opening issues on GitHub, or by implementing features yourself and making a pull request. We would also appreciate example models written using Turing to add to our examples.
Turing was originally created and is now managed by Hong Ge. Current and past Turing team members include Hong Ge, Adam Scibior, Matej Balog, Zoubin Ghahramani, Kai Xu and Emma Smith. You can see the full list of contributors on GitHub: https://github.com/yebai/Turing.jl/graphs/contributors. Thanks for the important additions, fixes and comments.
To cite Turing, please refer to the technical report. A sample BibTeX entry is given below:
@ARTICLE{Turing2016,
  author = {Ge, Hong and {\'S}cibior, Adam and Xu, Kai and Ghahramani, Zoubin},
  title  = "{Turing: A fast imperative probabilistic programming language.}",
  year   = 2016,
  month  = jun
}