I’m an undergrad astronomy researcher. About a month ago my advisor asked me if I’d ever heard of a strange thing called “Bayesian statistics.” I had, thanks to LessWrong :D.
Recently there’s been a movement in astronomy research towards Bayes. Astronomy is one of the most statistical of the physicses, so it’s about time this happened. The recent rush has been almost entirely caused by the rise of MCMC algorithms and increasing computing power.
Anyway, my project has been to redo a bunch of statistics from an old paper of his with new data and the new statistics. At first I didn’t think it would be any fun, but I’ve made huge progress and MCMC is really cool. I’m lucky that my advisor is good and gave me a “big picture.” Turns out with decent statistics we’ll be able to constrain cosmological parameters like the ratio of dark matter to luminous matter and such. Over the last few weeks I’ve figured everything important out. I’ve done all my fits, made a whole lotta graphs, and I’m writing a paper. Yeah!
On the side I’m teaching myself general relativity and figuring out how to better teach special relativity.
Nice. How are you implementing MCMC? Are you using one of the Gibbs samplers like BUGS or JAGS?
I’m also an astronomy undergrad, working with infrared spectra. I don’t currently need MCMC, but it’s interesting and I might play around with JAGS in the future.
I didn’t use a Gibbs sampler, but they seem useful. This article has some JAGS code you can check out. I actually used a Python sampler called emcee. It’s nice for simpler fits, but gets messy when you have complicated priors and such. For the most part I was doing “simple” linear fits, and I followed Hogg’s approach (look in section 8). There’s also a 2007 article by Kelly that covers this, but it’s somewhat dated by now.
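To give a flavor of the kind of Bayesian linear fit being discussed: here’s a minimal sketch using a plain Metropolis sampler in numpy instead of emcee (emcee’s ensemble machinery is overkill for illustrating the idea). The data, true parameters, and step size are all made up for the example; this assumes flat priors and known Gaussian errors on y, which is the simplest case Hogg treats.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up synthetic data: a straight line y = m*x + b
# with known Gaussian uncertainties sigma_y on each point.
true_m, true_b = 2.0, 1.0
x = np.linspace(0, 10, 20)
sigma_y = np.full_like(x, 1.0)
y = true_m * x + true_b + rng.normal(0, sigma_y)

def log_posterior(theta):
    """Log posterior for (m, b): flat priors + Gaussian likelihood."""
    m, b = theta
    resid = y - (m * x + b)
    return -0.5 * np.sum((resid / sigma_y) ** 2)

# Plain Metropolis: propose a Gaussian step, accept with
# probability min(1, posterior ratio).
theta = np.array([0.0, 0.0])
logp = log_posterior(theta)
samples = []
for _ in range(20000):
    proposal = theta + rng.normal(0, 0.05, size=2)
    logp_new = log_posterior(proposal)
    if np.log(rng.uniform()) < logp_new - logp:
        theta, logp = proposal, logp_new
    samples.append(theta.copy())

samples = np.array(samples[5000:])  # discard burn-in
m_est, b_est = samples.mean(axis=0)
m_err, b_err = samples.std(axis=0)
print(f"m = {m_est:.2f} +/- {m_err:.2f}, b = {b_est:.2f} +/- {b_err:.2f}")
```

With emcee you’d hand the same `log_posterior` to an `EnsembleSampler` and let it handle the proposals; the chain post-processing (burn-in, marginal means and spreads) looks the same either way.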
Posterior distributions from samplers are really fun to play with.
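Part of the fun: once you have posterior samples, the posterior of any derived quantity is just the function applied sample-by-sample. A tiny self-contained illustration (the "samples" here are Gaussian fakes standing in for a real chain, and the predicted-value example is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-ins for posterior samples of slope m and intercept b
# from some sampler run (fake Gaussians, just to show the trick).
m_samples = rng.normal(2.0, 0.1, 10000)
b_samples = rng.normal(1.0, 0.4, 10000)

# Derived quantity: the model prediction at x = 5.
# Its posterior is just the function of the samples.
y5 = m_samples * 5.0 + b_samples

lo, hi = np.percentile(y5, [16, 84])  # central 68% credible interval
print(f"y(5) = {y5.mean():.2f}, 68% interval [{lo:.2f}, {hi:.2f}]")
```

No error-propagation formulas needed; correlations between parameters are carried automatically because each sample is a joint draw.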