
MCMski & BNPski

April 16, 2013

I’ve been notified that my contributed session for MCMSki IV has been accepted by the scientific committee. The conference will be held in Chamonix, in the French Alps, which will be only the second time in my life that I’ve seen snow! I plan to spend all of January 2014 in Europe, including a visit to my dear friend Mirko.

The abstracts for my session on Computational Methods for Image Analysis are after the jump.

Bayesian inference on a mixture model with spatial dependence
Dr Lionel Cucala, Université Montpellier II, France
We introduce a new technique to select the number of labels of a mixture model with spatial dependence. It consists of an estimate of the Integrated Completed Likelihood based on a Laplace approximation, together with a new technique for dealing with the intractable normalizing constant of the hidden Potts model. Our proposal is applied to a real satellite image.

Perfect simulation for image analysis
A/Prof Mark Huber, Claremont McKenna College, USA
In this talk I will discuss perfect simulation for discrete and continuous autonormal models for image analysis. For the continuous autonormal model, monotonic CFTP can be shown to always converge quickly, while for discrete models the rate of convergence depends sharply on the influence of the prior. Perfect simulation can also be used with Swendsen-Wang-type chains, and partially recursive acceptance/rejection can be effective for a nontrivial class of models.

MCMC sampling from the hidden Potts model with informative priors
Mr Matt Moores, Queensland University of Technology, Australia
This talk will describe informative priors for the hidden Potts model and discuss strategies for improving mixing. The information gain from the observed pixel values in a noisy image can be insufficient for achieving reliable inference. Priors on the pixel intensities and the latent labels can be incorporated into the model, but these can introduce problems of their own. For example, the combination of the prior and likelihood can produce sharp peaks in the posterior distribution, resulting in latent labels that become “stuck” on a single value. The performance of several MCMC variants is compared, with application to medical imaging.
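To give a flavour of the kind of sampler the abstract refers to, here is a minimal sketch of a single-site Gibbs sweep over the latent labels of a hidden Potts model with a Gaussian observation likelihood. This is my own illustrative code, not the implementation discussed in the talk; all function and variable names (`gibbs_sweep`, `beta`, `mu`, `sigma`) are assumptions, and the conditional used is the standard one: each label is resampled in proportion to its likelihood times exp(beta × number of like-labelled neighbours).

```python
import numpy as np

def gibbs_sweep(z, y, mu, sigma, beta, k, rng):
    """One single-site Gibbs sweep over the latent labels z (n x m) of a
    hidden Potts model with k states, Gaussian likelihood N(mu[z], sigma[z]^2)
    for the observed image y, and inverse temperature beta.
    Illustrative sketch only; names and structure are my own assumptions."""
    n, m = z.shape
    for i in range(n):
        for j in range(m):
            # collect the labels of the 4-connected neighbours (first-order)
            neigh = []
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                a, b = i + di, j + dj
                if 0 <= a < n and 0 <= b < m:
                    neigh.append(z[a, b])
            neigh = np.array(neigh)
            # log full conditional for each candidate label:
            # Gaussian log-likelihood + Potts prior term
            logp = np.empty(k)
            for lab in range(k):
                like = (-0.5 * ((y[i, j] - mu[lab]) / sigma[lab]) ** 2
                        - np.log(sigma[lab]))
                prior = beta * np.sum(neigh == lab)
                logp[lab] = like + prior
            # normalise in a numerically stable way and sample the new label
            p = np.exp(logp - logp.max())
            p /= p.sum()
            z[i, j] = rng.choice(k, p=p)
    return z
```

With a strongly informative likelihood (well-separated `mu`, small `sigma`) the conditional distributions become sharply peaked, which is exactly the "stuck" behaviour the abstract describes: single-site updates then mix very slowly between label configurations.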

From → MCMC

  1. That’s good news. I wasn’t aware there was a BNPski conference. Looks like some good sessions at MCMSki; particularly Approximate Inference and Recent Developments in Software.

  2. BNPski is the free satellite workshop organised by Judith Rousseau for the last day of the conference. I agree that the round table session on BUGS, JAGS, Stan & BiiPS will be a highlight.
