This post looks at the convergence of the chequerboard Gibbs sampler for the hidden Potts model in the presence of an external field. This algorithm is implemented as the function `mcmcPotts` in my R package, **bayesImageS**. Previous posts have looked at the convergence of Gibbs and Swendsen-Wang algorithms without an external field, as implemented in the `mcmcPottsNoData` and `swNoData` functions.

In the stats department at Warwick we have a reading group that is currently discussing Mark Huber's 2015 book, *Perfect Simulation*. Back in May, I presented the third chapter, on Coupling from the Past (CFTP; Propp & Wilson, 1996). The `mono_cftp_Ising` function below implements monotonic CFTP for the Ising model (equivalent to the Potts model with only `q=2` states). This algorithm returns a single, unbiased sample from the Ising model for a given inverse temperature, β. When combined with the exchange algorithm (Murray, Ghahramani & MacKay, 2006), this enables exact posterior inference for β. However, problems can occur when the value of β is too large, since the underlying single-site Gibbs sampler can fail to converge.
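The published code is not reproduced in this excerpt, but the idea behind monotone CFTP can be sketched in a few lines of R. The sketch below is my own illustration, not the `mono_cftp_Ising` code from the post: it assumes spins in {-1, +1} on an n × n lattice with free boundaries and random-scan heat-bath updates, running coupled chains from the all-up and all-down configurations and reusing the same randomness while doubling the time horizon backwards, as in Propp & Wilson (1996).

```r
# Minimal sketch of monotone CFTP for the ferromagnetic Ising model.
# Function and argument names are hypothetical, not from bayesImageS.
mono_cftp_ising <- function(n, beta, max_doublings = 20) {
  n_sites <- n * n
  # Precompute each site's neighbours as linear (column-major) indices.
  nb_list <- vector("list", n_sites)
  for (j in 1:n) for (i in 1:n) {
    s <- (j - 1) * n + i
    nb <- integer(0)
    if (i > 1) nb <- c(nb, s - 1)
    if (i < n) nb <- c(nb, s + 1)
    if (j > 1) nb <- c(nb, s - n)
    if (j < n) nb <- c(nb, s + n)
    nb_list[[s]] <- nb
  }
  # One block of n_sites heat-bath updates, driven by stored randomness.
  sweep_once <- function(x, visit, u) {
    for (k in seq_along(visit)) {
      s <- visit[k]
      h <- sum(x[nb_list[[s]]])             # local field from neighbours
      p_up <- 1 / (1 + exp(-2 * beta * h))  # P(spin = +1 | neighbours)
      x[s] <- if (u[k] < p_up) 1L else -1L
    }
    x
  }
  visits <- list(); unifs <- list()  # stored random maps; index t = time -t
  horizon <- 1
  for (d in 0:max_doublings) {
    while (length(visits) < horizon) {      # extend further into the past
      visits[[length(visits) + 1]] <- sample.int(n_sites, n_sites, replace = TRUE)
      unifs[[length(unifs) + 1]] <- runif(n_sites)
    }
    top <- rep(1L, n_sites); bottom <- rep(-1L, n_sites)
    for (t in horizon:1) {                  # run from time -horizon up to 0
      top    <- sweep_once(top,    visits[[t]], unifs[[t]])
      bottom <- sweep_once(bottom, visits[[t]], unifs[[t]])
    }
    # Monotonicity: if the two extreme chains meet, every chain has coalesced.
    if (all(top == bottom)) return(matrix(top, n, n))
    horizon <- 2 * horizon
  }
  stop("no coalescence: beta may be too large for single-site updates")
}
```

For β well below the critical value the chains coalesce after a short horizon; the `stop()` branch is exactly the failure mode mentioned above when β is too large for the single-site sampler.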

In July I attended a month-long programme at the Isaac Newton Institute, Cambridge, organised by the *i-like* project: *“Scalable inference; statistical, algorithmic, computational aspects.”* Videos of some selected talks are now available online, so I thought I would highlight some that in my opinion are particularly worth watching.

The new release of R 3.4.1 “Single Candle” for macOS 10.11 (El Capitan) and higher was built with clang 4.0.0 and gfortran 6.1. Given my previous issues with the clang++ compiler, I was curious to see how much of an improvement this would be. The details are below, but in brief my conclusion is that Stan and NVIDIA CUDA users should hold off for now, until some teething problems with the new toolchain have been sorted out. This is disappointing, since it looks like OpenMP is working (finally!) in this version of the compiler.

Over the next two weeks, I’ll be attending the SMC workshop in Uppsala, Sweden, and the annual conference of the Royal Statistical Society in Glasgow, UK. Abstracts for my presentations are below. Hope to see you there!

In other news, all 51 discussions (including mine) of “Beyond subjective and objective in statistics” by Gelman & Hennig (JRSS A, 2017) are now available online. Plenty of thoughtful commentary on the philosophy of science, and statistics in particular.


This post follows up on a previous one, where I showed that the R function `nls()` was giving biased estimates in the presence of heteroskedastic, truncated noise. The **nlme** package provides the function `gnls()` for generalised least squares, but this seemed to involve defining a custom `varFunc` class to reweight the observations; for more detail on that option, refer to ch. 5 of Pinheiro & Bates (2000). Instead, I show how I formulated the likelihood in the Stan modelling language and estimated the parameter using Hamiltonian Monte Carlo (HMC). Thanks very much to Bob Carpenter for his help in getting this code to work.
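The Stan code from the post is not reproduced in this excerpt. As a rough sketch of how such a likelihood can be written, the fragment below assumes, purely for illustration, an exponential-decay mean function, noise scale proportional to the mean, and weakly informative priors; the actual model in the post may differ in all of these choices.

```stan
data {
  int<lower=1> N;
  vector[N] x;           // covariate
  vector<lower=0>[N] y;  // response, truncated below at zero
}
parameters {
  real<lower=0> theta;   // decay rate of the (assumed) mean function
  real<lower=0> sigma;   // baseline noise scale
}
model {
  vector[N] mu = exp(-theta * x);
  theta ~ normal(0, 1);
  sigma ~ cauchy(0, 1);
  for (n in 1:N)
    y[n] ~ normal(mu[n], sigma * mu[n]) T[0, ];  // heteroskedastic, truncated
}
```

The `T[0, ]` suffix is Stan's built-in truncation, which renormalises the normal density on the positive half-line; this is the part that `nls()` cannot express and that biases its estimates.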

Somehow I managed to sign up to give 4 talks at Warwick during the next 3 weeks (!) This Tuesday and next, I will be presenting the third chapter of Mark Huber's 2015 book, *Perfect Simulation*, at the reading group of the same name. This week will focus on Coupling from the Past (Propp & Wilson, 1996), while next week I will present perfect slice sampling (Mira, Møller & Roberts, 2001). A finite sample drawn using CFTP is unbiased, so it can be incorporated into pseudo-marginal methods such as the exchange algorithm (Murray, Ghahramani & MacKay, 2006). More about CFTP in a future blog post, no doubt!