
Upcoming conferences and seminars

May 29, 2017

Somehow I managed to sign up to give 4 talks at Warwick during the next 3 weeks (!) This Tuesday and next, I will be presenting the third chapter of Mark Huber's 2015 book, Perfect Simulation, at the reading group of the same name. This week will focus on coupling from the past (CFTP; Propp & Wilson, 1996), while next week I will present perfect slice sampling (Mira, Møller & Roberts, 2001). A sample drawn using CFTP is unbiased, so it can be incorporated into pseudo-marginal methods such as the exchange algorithm (Murray, Ghahramani & MacKay, 2006). More about CFTP in a future blog post, no doubt!
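For a flavour of how CFTP works, here is a minimal sketch in Python (a toy example of my own, not code from Huber's book): a monotone random walk on {0, ..., n}, with coupled chains run from the past starting in the top and bottom states; when they coalesce, the common value is an exact draw from the stationary distribution. The function name and the particular chain are illustrative choices.

```python
import random

def cftp_random_walk(n=4, seed=0):
    """Perfect sample from the stationary distribution of a random walk
    on {0, ..., n} via Coupling from the Past (Propp & Wilson, 1996)."""
    rng = random.Random(seed)
    updates = []          # shared randomness; updates[k] is reused as the
    T = 1                 # update at time -(k+1) every time we restart
    while True:
        # extend the random update sequence further back into the past
        while len(updates) < T:
            updates.append(rng.random())
        lo, hi = 0, n     # monotone coupling: sandwich between extreme states
        for t in range(T - 1, -1, -1):   # run from time -T up to time 0
            u = updates[t]
            lo = min(lo + 1, n) if u < 0.5 else max(lo - 1, 0)
            hi = min(hi + 1, n) if u < 0.5 else max(hi - 1, 0)
        if lo == hi:      # chains have coalesced: exact stationary draw
            return lo
        T *= 2            # otherwise, restart from further back in the past
```

The key detail is that the random updates are fixed and reused when the starting time is pushed further into the past; only the new, earlier updates are freshly drawn.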

Next Monday (June 5), I will be giving an introduction to splines, based on chapter 5 of the Elements of Statistical Learning (Hastie, Tibshirani & Friedman, 2009, 2nd ed.), supplemented by assorted other references.

Obviously, I’m not going to cover all of that in a one-hour talk. The main ideas that I’ll aim to get across are Gibbs sampling and other estimation methods for the smoothing parameter, λ. This talk will be part of the reading group in statistical machine learning.
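To make the role of λ concrete, here is a sketch in Python of a penalized spline fit with λ chosen by generalised cross-validation, one of the estimation methods alongside the Gibbs sampler. The truncated-power basis and the function names are my own illustrative choices, not code from the book.

```python
import numpy as np

def truncated_power_basis(x, knots):
    """Cubic truncated power basis: 1, x, x^2, x^3, (x - k)_+^3."""
    cols = [np.ones_like(x), x, x**2, x**3]
    cols += [np.clip(x - k, 0, None)**3 for k in knots]
    return np.column_stack(cols)

def fit_penalized_spline(x, y, knots, lam):
    """Penalized least squares: minimise ||y - Bc||^2 + lam * ||c_K||^2,
    a ridge penalty on the truncated-power (knot) coefficients only."""
    B = truncated_power_basis(x, knots)
    D = np.zeros(B.shape[1]); D[4:] = 1.0   # penalise only the knot terms
    coef = np.linalg.solve(B.T @ B + lam * np.diag(D), B.T @ y)
    return B, coef

def gcv_score(x, y, knots, lam):
    """Generalised cross-validation: (RSS / n) / (1 - tr(S)/n)^2."""
    B, coef = fit_penalized_spline(x, y, knots, lam)
    D = np.zeros(B.shape[1]); D[4:] = 1.0
    S = B @ np.linalg.solve(B.T @ B + lam * np.diag(D), B.T)  # smoother matrix
    n = len(y)
    rss = np.sum((y - B @ coef)**2)
    return (rss / n) / (1 - np.trace(S) / n)**2
```

Minimising the GCV score over a grid of λ values then trades off fidelity (RSS) against effective degrees of freedom, tr(S).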

The last talk will be an introduction to approximate Bayesian computation (ABC) for the Warwick ML Club. This talk will largely be based on a previous talk that I gave at ABC in Sydney.
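As a reminder of the basic idea, here is a minimal rejection-ABC sketch in Python (a generic toy, not material from the talk): draw parameters from the prior, simulate a summary statistic, and keep the draws whose simulations land within a tolerance ε of the observed summary.

```python
import random
import statistics

def abc_rejection(s_obs, prior_draw, simulate, eps, n_keep):
    """Rejection ABC for a scalar summary statistic: accept a prior
    draw theta when |simulate(theta) - s_obs| <= eps."""
    kept = []
    while len(kept) < n_keep:
        theta = prior_draw()
        if abs(simulate(theta) - s_obs) <= eps:
            kept.append(theta)
    return kept
```

For example, with data from a N(2, 1) model, a Uniform(-5, 5) prior on the mean, and the sample mean as the summary statistic, the accepted draws concentrate around 2; shrinking ε sharpens the approximation at the cost of a lower acceptance rate.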

The summer is a busy time for conferences in the UK. I will be presenting a poster at the Workshop on New mathematical methods in computational imaging at Heriot-Watt University, Edinburgh, on June 30:

Approximate Posterior Inference for the Inverse Temperature of a Hidden Potts Model

There are many approaches to Bayesian computation with intractable likelihoods, including the exchange algorithm and approximate Bayesian computation (ABC). A serious drawback of these algorithms is that they do not scale well for models with a large state space. Markov random fields, such as the Potts model and exponential random graph model (ERGM), are particularly challenging because the number of discrete variables increases linearly with the size of the image or graph. The likelihood of these models cannot be computed directly, due to the presence of an intractable normalising constant. In this context, it is necessary to employ algorithms that provide a suitable compromise between accuracy and computational cost.

Bayesian indirect likelihood (BIL) is a class of methods that approximate the likelihood function using a surrogate model. This model can be trained using a pre-computation step, utilising massively parallel hardware to simulate auxiliary variables. We review various types of surrogate model that can be used in BIL. In the case of the Potts model, we introduce a parametric approximation to the score function that incorporates its known properties, such as heteroskedasticity and critical temperature. We demonstrate this method on 2D satellite remote sensing and 3D computed tomography (CT) images. We achieve a hundredfold improvement in the elapsed runtime, compared to the exchange algorithm or ABC. Our algorithm has been implemented in the R package “bayesImageS,” which is available from CRAN.
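To illustrate the pre-computation idea behind BIL in the simplest possible setting, here is a toy Python sketch: simulate summary statistics on a grid of parameter values offline (a step that parallelises trivially), fit smooth curves to their mean and standard deviation, and use the resulting Gaussian surrogate in place of the intractable likelihood. This is a generic stand-in with illustrative names, not the parametric score-function approximation for the Potts model described in the abstract.

```python
import numpy as np

def precompute_surrogate(simulate, theta_grid, n_rep=50, degree=3):
    """Offline step of an indirect-likelihood scheme: simulate summary
    statistics on a parameter grid (embarrassingly parallel in practice)
    and fit polynomial curves to their mean and standard deviation."""
    means, sds = [], []
    for theta in theta_grid:
        s = np.array([simulate(theta) for _ in range(n_rep)])
        means.append(s.mean())
        sds.append(s.std())
    mean_fit = np.polynomial.Polynomial.fit(theta_grid, means, degree)
    sd_fit = np.polynomial.Polynomial.fit(theta_grid, sds, degree)
    return mean_fit, sd_fit

def surrogate_loglik(s_obs, theta, mean_fit, sd_fit):
    """Gaussian surrogate for the intractable log-likelihood of the
    observed summary statistic, evaluated at parameter theta."""
    mu, sd = mean_fit(theta), max(sd_fit(theta), 1e-8)
    return -0.5 * ((s_obs - mu) / sd)**2 - np.log(sd)
```

Inside an MCMC run, `surrogate_loglik` replaces the exact likelihood evaluation, so no auxiliary simulations are needed online; all of the simulation cost is paid once, up front.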

During July, I will be attending the programme “Scalable inference; statistical, algorithmic, computational aspects” at the Isaac Newton Institute (INI), Cambridge. This includes two workshops, so far as I am aware: “Scalable Statistical Inference” (July 3-7) and “Sampling methods in statistical physics and Bayesian inference” (July 18).

My abstract has also been accepted for the RSS International Conference in Glasgow, September 4-7. According to the conference programme, my talk has been scheduled for contributed session 6.5 Big Data, after lunch on the Wednesday. Hope to see you there!


