
Learning to walk again

February 28, 2013

Wow, I’ve found a lot of bugs in my code this week.

To be fair, I haven’t written any serious C++ code since 2003. A lot of idioms from Java and R have infected my programming style, so I’ve been making more than my usual number of typos, but these generally get picked up at compile time.

My Gibbs sampler is less than 400 lines of code, so it barely even qualifies as “serious” software development. Nevertheless, the flaws that I’ve uncovered are embarrassing newbie mistakes, such as the following:

  • setting the precision to the inverse square root of the standard deviation (power of -0.5 instead of -2)
  • using the sum of squares instead of the sum of squared differences (SSD)
  • forgetting to take the square root of the inverse gamma when returning the standard deviation
  • forgetting to set the old labels z to zero before sampling the new labels
  • calling abs(..) instead of fabs(..), thereby truncating the value to an integer
  • forgetting that z has one more row than y in my algorithm (this simplifies the definition of the neighbourhood matrix)
  • using n_cols instead of n_elem on an arma::rowvec didn’t behave as I expected
  • calling size(..) on an arma::rowvec definitely didn’t behave as I expected!

The only way that I tracked these down was by going back to basics. A hidden Potts model can be viewed as a specialization of an independent mixture model, which in turn is a specialization of a single distribution (i.e. a mixture model with only one component, k=1). With the assumption of additive Gaussian noise, we end up with the following inheritance diagram:

[Figure: UML inheritance diagram for the R package mcmcPotts]

Adding these simpler classes enabled me to test methods like gibbsStdDev(…) in isolation. I also ran the code against simulated data, instead of my enormous cone-beam CT scans. The moral of the story is: baby steps, or you’ll trip over yourself!


From → C++
