Comments on: Line Emission [EotW]
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-line-emission/
Weaving together Astronomy+Statistics+Computer Science+Engineering+Instrumentation, far beyond the growing borders
Fri, 01 Jun 2012 18:47:52 +0000 hourly 1 http://wordpress.org/?v=3.4

By: TomLoredo
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-line-emission/comment-page-1/#comment-229
Thu, 22 May 2008 04:00:13 +0000

Hyunsook, that's an encouraging story: MCMC is being taught effectively to astronomers at Harvard. As you noted, there's nothing about MCMC that is particularly Bayesian (in fact, even Bayesian use of it is typically frequentist in flavor), so there's no need to start with Bayes's theorem. After all, (Bayesian) statisticians "borrowed" MCMC from statistical physics, where it had been flourishing in non-statistical problems since the time of Metropolis et al.

Fundamentally, MCMC is just a way to build a pseudo-random number generator; the algorithm is a Big Deal because it lets you build one for complicated, multivariate distributions in an apparently straightforward fashion. (I say "apparently straightforward" because there is a lot to worry about in the actual implementation for complex problems, even though the Metropolis-Hastings algorithm most commonly used for MCMC is incredibly simple.) It just so happens that Bayesian calculations often require integrals of complicated, multivariate distributions, so MCMC has become closely associated with Bayes.

One of my first introductions to MCMC was a review paper by D. Toussaint, "Introduction to algorithms for Monte Carlo simulations and their application to QCD" (Computer Physics Communications, v56, 69-92, 1989). The first few pages have a physics-flavored intro to the basic ideas of MCMC, motivated by problems in statistical physics and lattice QCD (not a prior or likelihood in sight!).
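The "incredibly simple" Metropolis-Hastings algorithm mentioned above fits in a few lines. Here is a minimal sketch in Python; the target density, proposal scale, and sample counts are illustrative choices of mine, not anything from the comment:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, proposal_scale=1.0):
    """Random-walk Metropolis sampler for a 1-D unnormalized density.

    log_target: function returning the log of the (unnormalized) target density.
    Only ratios of the target appear, so the normalizing constant is never needed.
    """
    x = x0
    log_p = log_target(x)
    samples = []
    for _ in range(n_samples):
        # Symmetric Gaussian proposal, so the Hastings correction cancels
        # and this reduces to the original Metropolis rule.
        y = x + random.gauss(0.0, proposal_scale)
        log_q = log_target(y)
        # Accept with probability min(1, p(y)/p(x)); on rejection, repeat x.
        if math.log(random.random()) < log_q - log_p:
            x, log_p = y, log_q
        samples.append(x)
    return samples

# Example: sample a standard normal from its unnormalized log-density.
random.seed(42)
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
burned = draws[5000:]           # discard burn-in
mean = sum(burned) / len(burned)
```

The chain's draws are correlated, which is part of the "lots to worry about in practice": one still has to choose the proposal scale, assess burn-in, and check mixing.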
I still send astronomer and physicist colleagues to this paper (among others) to see MCMC ideas in our language. As with Finkbeiner's lecture, a key element in Toussaint's presentation is detailed balance. It's a nice paper, taking the reader from basic Monte Carlo to the Metropolis algorithm, Langevin methods, and ultimately hybrid Monte Carlo, in not very many pages.

(It's perhaps worth noting that reversibility (detailed balance) is a sufficient but not necessary condition for a Markov chain to have a desired target distribution as its stationary distribution. There has been very little exploration of non-reversible algorithms; Diaconis, Holmes, and Neal had a paper on this not long ago.)
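The standard argument that detailed balance suffices for stationarity is one line; a sketch, writing \(\pi\) for the target and \(T\) for the transition kernel (notation introduced here, not in the comment):

```latex
\pi(x)\,T(x \to y) = \pi(y)\,T(y \to x) \;\;\forall\, x, y
\quad\Longrightarrow\quad
\sum_x \pi(x)\,T(x \to y) \;=\; \pi(y) \sum_x T(y \to x) \;=\; \pi(y).
```

Stationarity itself only requires the summed condition (global balance), which is why non-reversible chains can still have \(\pi\) as their stationary distribution.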

By: hlee
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-line-emission/comment-page-1/#comment-226
Wed, 14 May 2008 21:34:20 +0000

Irrelevant, but I want to share my experience from an astronomy class (Spring '08) taught by Prof. Finkbeiner at Harvard. As the semester approached its end, the professor asked students for topics to be covered, and one of them was MCMC. I became curious how he would introduce and cover the vast world of MCMC in an hour or so. I thought he would start with Bayes' theorem, which easily bores students from the outset (although Bayes' theorem prevails in daily life, consciously or unconsciously, when it comes to a <b>theorem</b> nobody gets excited). I think he was brilliant.

He began his lecture from <b>detailed balance</b> (derived from these line emission equations, if I recall correctly) to reach <b>posterior ~ likelihood*prior</b> without saying much about the statistical definitions of likelihoods, priors, and posteriors. Since the integrations basically require MCMC methodologies, he made a smooth transition and introduced Gibbs sampling and the Metropolis-Hastings algorithm.

The reason for telling my experience is that there is quite a bit of statistics in your post, although the equations are purely based on (atomic) physics.
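Gibbs sampling, the other algorithm mentioned in the lecture, can be sketched just as compactly when the full conditionals are known in closed form. A minimal Python illustration for a standard bivariate normal with correlation rho (the example distribution and parameter values are my own, not from the lecture):

```python
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is univariate normal:
        x | y ~ N(rho * y, 1 - rho^2),   y | x ~ N(rho * x, 1 - rho^2),
    so every update is an exact draw and no accept/reject step is needed.
    """
    rng = random.Random(seed)
    sd = (1.0 - rho * rho) ** 0.5
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)   # draw x from p(x | y)
        y = rng.gauss(rho * x, sd)   # draw y from p(y | x)
        samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
xs = [x for x, _ in samples[2000:]]   # discard burn-in
ys = [y for _, y in samples[2000:]]
```

The sample correlation of the retained draws should settle near the target rho, which is a quick sanity check on any Gibbs implementation.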
