Getting started with Bayesian analysis in R

Stan logo

There really is no excuse any more: getting started with Bayesian regression analysis in R is simple.

Step 1: install rstanarm from CRAN

Step 2: replace lm() with stan_glm() in your code
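The two steps above can be sketched as follows, using the built-in mtcars data as a stand-in for your own data frame:

```r
# install.packages("rstanarm")  # step 1, once
library(rstanarm)

# Classical linear regression:
fit_lm <- lm(mpg ~ wt + hp, data = mtcars)

# The Bayesian drop-in replacement -- same formula, same data argument:
fit_bayes <- stan_glm(mpg ~ wt + hp, data = mtcars)

summary(fit_bayes)
```

The formula and `data` arguments carry over unchanged; `stan_glm()` simply fits the model by MCMC instead of least squares.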

Sure, you’ll probably want to learn about priors, and invest a little in understanding diagnostics such as those provided by ShinyStan. But rstanarm is really designed to work well out of the box (i.e. with your existing code).
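Launching the ShinyStan diagnostics mentioned above is a one-liner on a fitted rstanarm model; a minimal sketch:

```r
library(rstanarm)
library(shinystan)

fit <- stan_glm(mpg ~ wt, data = mtcars)

# Opens an interactive diagnostics dashboard in the browser:
launch_shinystan(fit)
```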

What I really appreciate is that it has useful warnings and error messages, as well as extensive documentation. That documentation occasionally reminds you that quantitative analysis has something to do with mathematics, but even those who skip the Greek letters and formulae will get enough guidance. You'll be nudged to set your own priors rather than rely on the defaults, although in my experience the default priors work reasonably well for most simple applications. You'll also get suggestions right on your screen about what to do when there are, say, divergent transitions.
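For example, you can always inspect which priors were actually used, and act on the on-screen suggestions; a sketch (the specific prior and `adapt_delta` value here are illustrative, not recommendations):

```r
library(rstanarm)

fit <- stan_glm(mpg ~ wt + hp, data = mtcars)

# Show the (default or user-set) priors the model was fitted with:
prior_summary(fit)

# If warnings mention divergent transitions, one common suggestion is to
# raise adapt_delta (slower, but smaller sampler steps), and/or to set
# explicit priors:
fit2 <- stan_glm(mpg ~ wt + hp, data = mtcars,
                 prior = normal(0, 2.5),
                 adapt_delta = 0.99)
```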

Once you can handle rstanarm, you'll find it easy to upgrade to brms, where you can still use the regression syntax you trust from base R.
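A brief sketch of that upgrade path (brms compiles a Stan model on the fly, so the first fit takes a moment):

```r
library(brms)

# Same formula syntax as lm() / stan_glm():
fit_brms <- brm(mpg ~ wt + hp, data = mtcars)

# brms also understands lme4-style multilevel syntax, e.g. a varying
# intercept per group:
fit_ml <- brm(mpg ~ wt + (1 | cyl), data = mtcars)
```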

That hairy caterpillar

Textbooks on Bayesian inference often refer to a 'hairy caterpillar' when describing the traceplot and what it should look like. It's easy to come across examples of what things should look like, examples of this hairy caterpillar:

Hairy caterpillar

Likewise, we often see examples of the autocorrelation plot where everything is fine: a quick decrease to values around zero:
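Both plots are easy to produce yourself; a sketch using bayesplot on a fitted rstanarm model (the parameter names shown are those of this particular model):

```r
library(rstanarm)
library(bayesplot)

fit <- stan_glm(mpg ~ wt, data = mtcars)
draws <- as.array(fit)  # iterations x chains x parameters

# Traceplot: with good mixing, the chains overlap in one dense band --
# the 'hairy caterpillar'.
mcmc_trace(draws, pars = c("(Intercept)", "wt"))

# Autocorrelation plot: should drop quickly to values around zero.
mcmc_acf(draws, pars = c("(Intercept)", "wt"))
```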


What seems less common are examples of what things should not look like, such as a traceplot that does not look like a ‘hairy caterpillar’ at all, or autocorrelation that really does not want to behave. Here I provide examples of both. How about this beauty, where each chain seems to be up to its own thing? This definitely does not look like convergence, nor do the chains mix well.

No hairy caterpillar

Or how about this trend? We would need to stretch the definition of 'quickly' beyond any recognition to argue that this resembles a quick decrease.

Bad autocorrelation

So yes, it’s back to the drawing board for this model… longer chains (with more thinning) may not suffice here.
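For completeness, here is what the longer-chains-with-thinning attempt looks like (the data and settings are purely illustrative); as noted, if the problem lies in the model itself, reparameterising is usually the better fix:

```r
library(rstanarm)

# Brute force: more and longer chains, keeping every 5th draw.
# This helps with slow mixing, but not with a badly specified model.
fit_long <- stan_glm(mpg ~ wt + hp, data = mtcars,
                     chains = 4, iter = 10000, thin = 5)
```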