Bare-bones implementation of robust dynamic Hamiltonian Monte Carlo methods.




Project status: WIP – initial development is in progress; there has not yet been a stable release suitable for public use.


This package implements a modern version of the “No-U-turn sampler” in the Julia language, mostly as described in Betancourt (2017), with some tweaks.

In contrast to Mamba.jl and Klara.jl, which provide an integrated framework for building up a Bayesian model from small components, this package requires that you code a log-density function for the posterior which also provides derivatives (typically obtained via automatic differentiation).
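To make the requirement concrete, here is a minimal, hypothetical sketch (not the package's actual API) of such a log-density: a posterior for the mean of `y ~ Normal(μ, 1)` under a flat prior, together with its gradient. In practice you would obtain the gradient by automatic differentiation (e.g. with ForwardDiff.jl) rather than by hand.

```julia
# Hypothetical example — illustrates the kind of function the user supplies,
# not the package's actual interface.

y = [1.2, 0.7, 2.1]                # observed data

# log p(μ | y) up to an additive constant: -½ Σᵢ (yᵢ - μ)²
logpost(μ) = -0.5 * sum(abs2, y .- μ)

# analytic gradient: Σᵢ (yᵢ - μ); in practice, use automatic differentiation
∇logpost(μ) = sum(y .- μ)
```

The sampler only ever sees the value and the gradient of this function, which is what makes the interface independent of how the model was built.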

Since most of the runtime is spent evaluating the log density, this design lets you apply standard tools such as profiling and benchmarking to optimize its performance.

Consequently, this package requires that the user is comfortable with the basics of Bayesian inference, to the extent of coding a (log) posterior density in Julia. Gelman et al. (2013) and Gelman and Hill (2007) are excellent introductions.

Also, the building blocks of the algorithm are implemented using a functional (non-modifying) style wherever possible. This allows extensive unit testing of components, and is also intended to serve as a transparent, pedagogical introduction to the low-level mechanics of current Hamiltonian Monte Carlo samplers.


Examples are available in DynamicHMCExamples.jl.

Support and participation

For general questions, open an issue or ask on the Discourse forum.

The API is in the process of being refined to accommodate various modeling approaches. Users who wish to participate in the discussion should subscribe to GitHub notifications ("watching" the package). I will also do my best to accommodate feature requests; just open an issue.


References

Betancourt, M. J., Byrne, S., & Girolami, M. (2014). Optimizing the integrator step size for Hamiltonian Monte Carlo. arXiv preprint arXiv:1411.6669.

Betancourt, M. (2016). Diagnosing suboptimal cotangent disintegrations in Hamiltonian Monte Carlo. arXiv preprint arXiv:1604.00695.

Betancourt, M. (2017). A Conceptual Introduction to Hamiltonian Monte Carlo. arXiv preprint arXiv:1701.02434.

Gelman, A., Carlin, J. B., Stern, H. S., Dunson, D. B., Vehtari, A., & Rubin, D. B. (2013). Bayesian data analysis. CRC Press.

Gelman, A., & Hill, J. (2007). Data analysis using regression and multilevel/hierarchical models. Cambridge University Press.

Hoffman, M. D., & Gelman, A. (2014). The No-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo. Journal of Machine Learning Research, 15(1), 1593-1623.
