The R package **gputools** has been consumed in the CRANpocalypse, but version 1.1 and earlier can still be downloaded as a source package from the archive. In order to compile it for macOS 10.12.6 (Sierra), you will need to install version 8 of the CUDA Toolkit as well as version 8.2.1 of the Xcode command-line tools. Even then, there are some major configuration issues that need to be dealt with. For the exceptionally brave, the excruciating details are below…

My second R package, **serrsBayes**, is now available on CRAN. It uses a sequential Monte Carlo (SMC) algorithm to separate an observed spectrum into three components: the peaks, the baseline, and additive white noise.
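Schematically (my notation here is simplified; see the preprint for the exact parameterisation), the model for the intensity observed at wavenumber $\tilde\nu_i$ is:

```latex
y(\tilde\nu_i) = s(\tilde\nu_i;\,\boldsymbol\theta) + \ell(\tilde\nu_i) + \varepsilon_i,
\qquad \varepsilon_i \overset{iid}{\sim} \mathcal{N}(0, \sigma^2),
```

where $s(\cdot)$ is the sum of the peaks, $\ell(\cdot)$ is a smooth baseline, and $\varepsilon_i$ is the additive white noise.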

More details about the model and SMC algorithm are available in my preprint on arXiv (Moores et al., 2016; v2 2018). The following gives an example of applying **serrsBayes** to surface-enhanced Raman spectroscopy (SERS) data from a previous paper (Gracie et al., 2016).

> If you want to destroy my sweater
>
> Hold this thread as I walk away
>
> *Undone — Weezer*

I received an unexpected email about the new version 0.5-0 of bayesImageS:

> Dear maintainer,
>
> Please see the problems shown on
> <https://cran.r-project.org/web/checks/check_results_bayesImageS.html>.
>
> Please correct before 2018-02-11 to safely retain your package on CRAN.

A new version 0.5-0 of my R package bayesImageS is now available on CRAN. It is accompanied by a revision of my paper with Kerrie and Tony, “Scalable Bayesian inference for the inverse temperature of a hidden Potts model” (Moores, Pettitt & Mengersen, arXiv:1503.08066v2). This paper introduces the parametric functional approximate Bayesian (PFAB) algorithm *(the ‘p’ is silent…)*, which is a form of Bayesian indirect likelihood (BIL).

This post looks at the convergence of the chequerboard Gibbs sampler for the hidden Potts model, in the presence of an external field. This algorithm is implemented as the function `mcmcPotts` in my R package, **bayesImageS**. Previous posts have looked at the convergence of the Gibbs and Swendsen-Wang algorithms without an external field, as implemented in the `mcmcPottsNoData` and `swNoData` functions.
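The chequerboard (red-black) decomposition is what makes the block Gibbs updates parallelisable: sites of one colour have no neighbours of the same colour, so their full conditionals are mutually independent given the other colour. **bayesImageS** computes these blocks internally; the following base-R sketch just illustrates the idea, and is not the package code:

```r
# Chequerboard ("red-black") partition of an n x m rectangular lattice.
# Sites of the same colour share no edges, so the Gibbs sampler can update
# an entire colour class simultaneously, conditional on the other colour.
checkerboard <- function(n, m) {
  idx <- expand.grid(i = seq_len(n), j = seq_len(m))
  colour <- (idx$i + idx$j) %% 2   # 0 or 1, alternating like a chessboard
  split(seq_len(n * m), colour)    # site indices for each colour class
}

blocks <- checkerboard(4, 4)
sapply(blocks, length)   # two blocks of 8 sites each
```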

In the stats department at Warwick we have a reading group that is currently discussing Mark Huber’s 2015 book, *Perfect Simulation*. Back in May, I presented the 3rd chapter, on Coupling from the Past (CFTP; Propp & Wilson, 1996). The `mono_cftp_Ising` function below implements monotonic CFTP for the Ising model (equivalent to the Potts model with only `q=2` states). This algorithm returns a single, unbiased sample from the Ising model for a given inverse temperature, β. When combined with the exchange algorithm (Murray, Ghahramani & MacKay, 2006), this enables exact posterior inference for β. However, problems can occur when the value of β is too large, since the underlying single-site Gibbs sampler can fail to converge.
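As a rough sketch of the algorithm (my own simplified version, assuming free boundary conditions and single-site heat-bath updates; not the exact code from the reading group):

```r
# Monotone CFTP for the Ising model on an n x n lattice, states in {-1, +1}.
# Sandwich the whole state space between the all-up and all-down chains,
# drive both with the SAME random numbers, and double the start time -T
# (reusing the randomness for later steps) until the two chains coalesce.
mono_cftp_ising <- function(beta, n = 16, seed = 1) {
  set.seed(seed)
  nbr_sum <- function(x, i, j) {   # sum of neighbouring spins (free boundary)
    s <- 0
    if (i > 1) s <- s + x[i - 1, j]
    if (i < n) s <- s + x[i + 1, j]
    if (j > 1) s <- s + x[i, j - 1]
    if (j < n) s <- s + x[i, j + 1]
    s
  }
  gibbs_update <- function(x, site, u) {  # heat-bath update of one site
    i <- site[1]; j <- site[2]
    p <- 1 / (1 + exp(-2 * beta * nbr_sum(x, i, j)))  # P(x[i,j] = +1 | rest)
    x[i, j] <- if (u < p) 1L else -1L
    x
  }
  Tlen <- 0
  sites <- matrix(0L, 0, 2)
  us <- numeric(0)
  repeat {
    new_T <- max(1, 2 * Tlen)
    extra <- new_T - Tlen
    # Prepend fresh randomness for the EARLIER time steps; randomness for
    # the steps closer to time 0 must stay fixed between restarts.
    sites <- rbind(matrix(sample.int(n, 2 * extra, replace = TRUE), extra, 2), sites)
    us <- c(runif(extra), us)
    Tlen <- new_T
    upper <- matrix(1L, n, n)    # all spins up
    lower <- matrix(-1L, n, n)   # all spins down
    for (t in seq_len(Tlen)) {
      upper <- gibbs_update(upper, sites[t, ], us[t])
      lower <- gibbs_update(lower, sites[t, ], us[t])
    }
    if (all(upper == lower)) return(upper)   # coalesced: exact sample at time 0
  }
}
```

Monotonicity holds because the heat-bath acceptance probability is increasing in the neighbour sum, so the upper chain dominates the lower one under common random numbers. For large β the coupling time explodes, which is the convergence problem mentioned above.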

In July I attended a month-long programme at the Isaac Newton Institute, Cambridge, organised by the *i-like* project: *“Scalable inference; statistical, algorithmic, computational aspects.”* Videos of some selected talks are now available online, so I thought I would highlight some that in my opinion are particularly worth watching.

The new release of R 3.4.1 “Single Candle” for macOS 10.11 (El Capitan) and higher was built with clang 4.0.0 and gfortran 6.1. Given my previous issues with the clang++ compiler, I was curious to see how much of an improvement this would be. The details are below, but in brief my conclusion is that Stan and nVidia CUDA users should hold off for now, until some teething problems with the new toolchain have been sorted out. This is disappointing, since it looks like OpenMP is working (finally!) in this version of the compiler.

Over the next two weeks, I’ll be attending the SMC workshop in Uppsala, Sweden, and the annual conference of the Royal Statistical Society in Glasgow, UK. Abstracts for my presentations are below. Hope to see you there!

In other news, all 51 discussions (including mine) of “Beyond subjective and objective in statistics” by Gelman & Hennig (JRSS A, 2017) are now available online. Plenty of thoughtful commentary on the philosophy of science and statistics in particular.


This post follows up on a previous one, where I showed that the R function `nls()` gives biased estimates in the presence of heteroskedastic, truncated noise. The **nlme** package provides the function `gnls()` for generalised least squares, but this seems to involve defining a custom `varFunc` class to reweight the observations; for more detail on this option, refer to ch. 5 of Pinheiro & Bates (2000). Instead, I show how I formulated the likelihood in the Stan modelling language and estimated the parameters using Hamiltonian Monte Carlo (HMC). Thanks very much to Bob Carpenter for his help in getting this code to work.
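For illustration only (the actual model from the post is not reproduced in this excerpt), a Stan likelihood for noise whose scale grows with the mean and whose observations are truncated below at zero might look like the following; the exponential mean function and the linear noise model are assumptions for the sketch:

```stan
data {
  int<lower=1> N;
  vector[N] x;
  vector<lower=0>[N] y;      // observations truncated below at zero
}
parameters {
  real<lower=0> a;
  real<lower=0> b;
  real<lower=0> sigma0;      // baseline noise level
  real<lower=0> lambda;      // heteroskedasticity: sd grows with the mean
}
model {
  for (n in 1:N) {
    real mu = a * exp(-b * x[n]);
    y[n] ~ normal(mu, sigma0 + lambda * mu) T[0, ];
  }
}
```

The `T[0, ]` suffix renormalises each likelihood term for the lower truncation, which is exactly the correction that `nls()` ignores.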