
bayesImageS::mcmcPotts

December 3, 2017

This post looks at the convergence of the chequerboard Gibbs sampler for the hidden Potts model in the presence of an external field. This algorithm is implemented as the function mcmcPotts in my R package, bayesImageS. Previous posts have looked at the convergence of the Gibbs and Swendsen-Wang algorithms without an external field, as implemented in the functions mcmcPottsNoData and swNoData.

The most accurate way to measure convergence is to use the coupling time of a perfect sampling algorithm, such as coupling from the past (CFTP). However, we can obtain a rough estimate by monitoring the distribution of the sufficient statistic:

S(\mathbf{z}) = \sum_{\{i,j\} \in \mathcal{E}} \delta(z_i, z_j)

where δ(x, y) is the Kronecker delta function. Note that this sum is defined over the unique, undirected edges \mathcal{E} of the lattice, to avoid double-counting. Under this definition, the critical inverse temperature of the q-state Potts model is \log\{1 + \sqrt{q}\}, or approximately 0.88 for the Ising model with q=2 unique labels. Some papers state that the critical value for the Ising model is 0.44, but this is because they have used a different definition of S(z) that counts every pair of neighbours twice.
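Both quantities are easy to check directly in R. The following is a minimal sketch (sufficientStat is a hypothetical helper, not part of bayesImageS), assuming z is a vector of pixel labels and edges is the two-column matrix of neighbouring pixel indices returned by getEdges:

log(1 + sqrt(2))  # critical inverse temperature for q=2: approx. 0.8814

# count matching labels over the unique, undirected edges
sufficientStat <- function(z, edges) {
  sum(z[edges[, 1]] == z[edges[, 2]])
}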

We will generate synthetic data for a sequence of values of the inverse temperature, β=(0.22,0.44,0.88,1.32,1.76,2.20):

library(bayesImageS)
library(doParallel)
set.seed(123)
q <- 2
beta <- c(0.22, 0.44, 0.88, 1.32, 1.76, 2.20)
mask <- matrix(1,nrow=500,ncol=500)
n <- prod(dim(mask))
neigh <- getNeighbors(mask, c(2,2,0,0))
block <- getBlocks(mask, 2)
edges <- getEdges(mask, c(2,2,0,0))
maxS <- nrow(edges)
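Here maxS is the number of unique edges, which is the upper bound on S(z), attained when all of the labels are equal. For a 500×500 lattice with a first-order (4-neighbour) neighbourhood, as specified by c(2,2,0,0), it works out as follows:

maxS            # number of unique edges
2 * 500 * 499   # = 499000: horizontal plus vertical edges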

cl <- makeCluster(min(4, detectCores()))
registerDoParallel(cl)

system.time(synth <- foreach (i=1:length(beta),
                      .packages="bayesImageS") %dopar% {
  gen <- list()
  gen$beta <- beta[i]
  # generate labels using Swendsen-Wang
  sw <- swNoData(beta[i], q, neigh, block, 200)
  gen$z <- sw$z
  gen$sum <- sw$sum[200]
  # now add Gaussian noise
  gen$mu <- rnorm(2, c(-1,1), 0.5)
  gen$sd <- 1/sqrt(rgamma(2, 1.5, 2))
  gen$y <- rnorm(n, gen$mu[(gen$z[1:n,1])+1],
                    gen$sd[(gen$z[1:n,1])+1])
  gen
})
stopCluster(cl)
##    user  system elapsed 
##   0.307   0.065  20.271
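A quick, optional look at the first synthetic dataset (not part of the original script): with means of roughly ±1 and noise standard deviations greater than 1, the two Gaussian components overlap heavily, so the labels are far from obvious given the pixel values alone.

summary(synth[[1]]$y)
hist(synth[[1]]$y, breaks=50, xlab="y", main="")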

Now let’s look at the distribution of Gibbs samples for the first dataset, using a fixed value of β:

priors <- list()
priors$k <- q
priors$mu <- c(-1,1)
priors$mu.sd <- rep(0.5,q)
priors$sigma <- rep(2,q)
priors$sigma.nu <- rep(1.5,q)
# a uniform prior on beta with equal endpoints fixes it at the true value
priors$beta <- rep(synth[[1]]$beta, 2)

# "ex" selects the exchange algorithm for updating beta
# (moot here, since beta is fixed by its prior)
mh <- list(algorithm="ex", bandwidth=1, adaptive=NA,
           auxiliary=1)
tm <- system.time(res <- mcmcPotts(synth[[1]]$y, neigh,
                            block, priors, mh, 100, 50))
print(tm)
ts.plot(res$sum, xlab="MCMC iterations", ylab=expression(S(z)))
abline(h=synth[[1]]$sum, col=4, lty=2)
##    user  system elapsed 
##  29.186   2.506   9.335

[Figure: trace plot of S(z) for β = 0.22, with the true value shown as a dashed blue line]
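A numerical check to accompany the trace plot, assuming res$sum records S(z) at every iteration, including the 50 burn-in iterations:

# post-burnin mean of the sufficient statistic vs. the value
# used to generate the data
mean(res$sum[51:100])
synth[[1]]$sum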

As expected for β=0.22 with n = 500×500 pixels, convergence takes only a dozen iterations or so. The same is true for β=0.44:

priors$beta <- rep(synth[[2]]$beta, 2)
tm2 <- system.time(res2 <- mcmcPotts(synth[[2]]$y,
                      neigh, block, priors, mh, 100, 50))
print(tm2)
ts.plot(res2$sum, xlab="MCMC iterations", ylab=expression(S(z)))
abline(h=synth[[2]]$sum, col=4, lty=2)
##    user  system elapsed 
##  25.194   3.393  11.495

[Figure: trace plot of S(z) for β = 0.44, with the true value shown as a dashed blue line]

Now with β=0.88, just below the critical inverse temperature:

priors$beta <- rep(synth[[3]]$beta, 2)
tm3 <- system.time(res3 <- mcmcPotts(synth[[3]]$y,
                      neigh, block, priors, mh, 100, 50))
print(tm3)
ts.plot(res3$sum, xlab="MCMC iterations", ylab=expression(S(z)))
abline(h=synth[[3]]$sum, col=4, lty=2)
##    user  system elapsed 
##  26.658   3.361  11.444

[Figure: trace plot of S(z) for β = 0.88, with the true value shown as a dashed blue line]

So far, so good. Now let’s try with β=1.32:

priors$beta <- rep(synth[[4]]$beta, 2)
tm4 <- system.time(res4 <- mcmcPotts(synth[[4]]$y,
                    neigh, block, priors, mh, 300, 150))
print(tm4)
ts.plot(res4$sum, xlab="MCMC iterations", ylab=expression(S(z)))
abline(h=synth[[4]]$sum, col=4, lty=2)
##    user  system elapsed 
##  88.414   9.170  30.481

[Figure: trace plot of S(z) for β = 1.32, with the true value shown as a dashed blue line]

This doesn’t really count as slow mixing, since the Gibbs sampler has converged within 300 iterations for a lattice of 500×500 pixels. Compare this with how long it takes without the external field:

system.time(res5 <- mcmcPottsNoData(synth[[4]]$beta, q,
                                    neigh, block, 20000))
##     user   system  elapsed 
## 1036.752   46.607  317.952
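The corresponding trace plot did not survive extraction, but assuming mcmcPottsNoData also returns the trace of the sufficient statistic in its sum element (as in the previous posts), it can be reproduced with the same commands as above:

ts.plot(res5$sum, xlab="MCMC iterations", ylab=expression(S(z)))
abline(h=synth[[4]]$sum, col=4, lty=2)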

[Figure: trace plot of S(z) for single-site Gibbs sampling without an external field at β = 1.32]
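For comparison, Swendsen-Wang mixes rapidly even above the critical value of β. A minimal sketch, reusing the objects defined above:

# SW converges within a few hundred iterations, even at beta = 1.32;
# this is why it was used to generate the synthetic labels in the first place
system.time(aux <- swNoData(synth[[4]]$beta, q, neigh, block, 500))
ts.plot(aux$sum, xlab="SW iterations", ylab=expression(S(z)))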

This contrast explains why single-site Gibbs sampling should never be used for the auxiliary iterations in ABC or the exchange algorithm, although it is usually fine for updating the hidden labels. The Gaussian likelihood of the observed pixels, which is referred to in statistical mechanics as an “external field,” helps the chain converge to the correct stationary distribution. Without this additional information to give it a “nudge,” the Gibbs sampler is more likely to become stuck in a local mode.

Note that all of these results have been for a fixed β. It is more difficult to assess convergence when β is unknown: a topic for a future post!
