
Using an ensemble sampler within a Gibbs step

See original GitHub issue

After talking to @davidwhogg about hierarchical modelling and Gibbs sampling at @astrohackweek, I’ve got some implementation questions.

#1. Can ensemble samplers be used in both/all phases of a Gibbs step?

For example, I have ensemble samplers for parameters local to each pixel. Each sampler has N_w walkers. Now how do I pass the last step from the pixel samplers on to the next phase that samples in global parameters? Specifically, what goes in the args to the run_mcmc method? I see two options:

  1. Just take the state from a single walker (the first one? or a random one?) and pass that on to the global sampler.
  2. If the ensemble samplers at the local and global levels both have N_w walkers then I could ‘align’ the walkers of each. Hence the final state of walker i at the local level would be used by walker i at the global level. I think this is more appealing, but it means that I’d need some way of telling the posterior function which walker it belongs to so it can choose the appropriate parameters from the args. @dfm, is there already a way to do this?
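
As far as I know, emcee’s log-probability function only receives the position vector (plus any args), so option 2 has no built-in support; option 1, though, is straightforward to wire up. Below is a minimal numpy-only sketch of a two-phase Gibbs loop in the spirit of option 1: each phase is advanced by a hand-rolled Goodman & Weare stretch move while the other phase’s parameter is frozen at a randomly chosen walker’s last state. The toy conditionals (`theta` local, `phi` global) and all names here are hypothetical illustrations, not emcee internals.

```python
import numpy as np

rng = np.random.default_rng(0)

def stretch_move(walkers, log_prob, a=2.0):
    """One Goodman & Weare stretch-move update of every walker.

    `walkers` is (n_walkers, n_dim); `log_prob` maps a parameter vector
    to its log posterior. A toy stand-in for an ensemble sampler step.
    """
    n_w, n_dim = walkers.shape
    new = walkers.copy()
    for k in range(n_w):
        # pick a complementary walker j != k
        j = rng.integers(n_w - 1)
        if j >= k:
            j += 1
        # z ~ g(z) proportional to 1/sqrt(z) on [1/a, a]
        z = ((a - 1.0) * rng.random() + 1.0) ** 2 / a
        proposal = new[j] + z * (new[k] - new[j])
        log_ratio = (n_dim - 1) * np.log(z) + log_prob(proposal) - log_prob(new[k])
        if np.log(rng.random()) < log_ratio:
            new[k] = proposal
    return new

# Toy conditionals: local parameter theta and global parameter phi,
# jointly Gaussian -- purely illustrative.
def make_local_logp(phi):
    return lambda th: -0.5 * np.sum((th - phi) ** 2)

def make_global_logp(theta):
    return lambda ph: -0.5 * np.sum(ph ** 2) - 0.5 * np.sum((theta - ph) ** 2)

n_walkers = 10
local = rng.normal(size=(n_walkers, 1))   # ensemble for theta
glob = rng.normal(size=(n_walkers, 1))    # ensemble for phi

for _ in range(50):
    # Phase 1: update theta, conditioning on a random walker's phi (option 1).
    phi_fixed = glob[rng.integers(n_walkers)]
    local = stretch_move(local, make_local_logp(phi_fixed))
    # Phase 2: update phi, conditioning on a random walker's theta.
    theta_fixed = local[rng.integers(n_walkers)]
    glob = stretch_move(glob, make_global_logp(theta_fixed))

print(local.shape, glob.shape)
```

With emcee itself, the analogous move would be to rebuild the sampler (or its args) each phase with the frozen value and call `run_mcmc` from the previous ensemble state; whether conditioning the whole ensemble on a single walker’s state preserves the target distribution is exactly the open question in this thread.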

Or is it A Bad Idea ™️ to use an ensemble sampler within Gibbs (as I think @dfm alluded to in #106, though @davidwhogg seemed optimistic)?

#2. How can an ensemble sampler be used as one Gibbs step alongside a non-ensemble sampler in another step?

If I use an ensemble sampler for one level of the Gibbs step and Metropolis-Hastings for another, what parameters do I pass on to MH? Do I always take the parameters at the last step of the i-th walker? Or pick a random walker? (This question mirrors option 1 from above.)

#3. Is it a special case to have an ensemble sampler step go into a trivial Gaussian Process step?

As @davidwhogg pointed out, in one of my global Gibbs phases I can make a sample from a normal distribution defined by model-observation residuals. In computing these residuals, should I:

  1. marginalize over the models of all walkers to make a single new sample
  2. just pick a single walker to make a single new sample, or
  3. make a new value for each walker (which means that again, the posterior function for the ensemble sampler at the next level would need to know what walker it belongs to).
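
Option 3 keeps the ensembles aligned and is cheap when the conditional really is a normal distribution. Here is a hedged numpy sketch of that option under an assumed toy model: each walker’s local model predicts the data, and a global offset `b` with a flat prior and known noise `sigma` has a Gaussian conditional given that walker’s residuals, so we draw one `b` per walker. All names and the model itself are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: n_walkers local-model states each predict n_data
# points; the true global offset is 2.0 and the noise sigma is known.
n_walkers, n_data = 8, 100
sigma = 0.5
data = rng.normal(2.0, sigma, size=n_data)
model_per_walker = rng.normal(0.0, 0.1, size=(n_walkers, n_data))

# Residuals per walker define the Gaussian conditional for b
# (flat prior): mean = average residual, variance = sigma^2 / n_data.
residuals = data - model_per_walker              # (n_walkers, n_data)
post_mean = residuals.mean(axis=1)               # conditional mean, per walker
post_var = sigma ** 2 / n_data                   # conditional variance

# Option 3: one independent draw of b for each walker.
b_draws = rng.normal(post_mean, np.sqrt(post_var))

print(b_draws.shape)
```

Each walker then carries its own `b` into the next ensemble phase, which again means the posterior function would need to know which walker it belongs to in order to pick the right `b` (the same wiring problem as option 2 in question #1).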

(This is a question directed mostly to @davidwhogg and @dfm but anyone is welcome to chime in!)

Issue Analytics

  • State: closed
  • Created 9 years ago
  • Comments:22 (15 by maintainers)

Top GitHub Comments

2 reactions
jonathansick commented, Mar 23, 2017

Hey @RuthAngus, I ended up not using emcee and instead just implemented my own Metropolis-Hastings in Gibbs (contrary to everyone’s advice to look for an off-the-shelf solution 😉 ).

In my problem, modeling the stellar population in N pixels with a background parameter that applies to all pixels, I do this:

  1. For each stellar population parameter of each pixel, I do an MH step to propose a new parameter. So at least I can do my N pixels in parallel, but the MH step is always a single parameter.
  2. When the stellar population parameters of each pixel are all updated, I go up a level and estimate a new background. It turns out that I just need a linear estimator for this.
  3. Using the new background parameter, I go back to the pixel level and serially do MH steps again through stellar population parameters of each pixel.
  4. Re-estimate background level, and repeat.

This ends up being very simple to implement and was really all I needed. I’m not sure if every project could afford to break up their multi-parameter steps into a series of single parameter MH steps, but it worked for me.
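
The recipe above can be sketched as a toy MH-within-Gibbs loop. Everything here is an illustrative stand-in for the original code: a Gaussian model `data_i ~ N(theta_i + b, sigma)`, a Gaussian random-walk proposal per pixel parameter, and a mean residual as the “linear estimator” for the background.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model: n_pix pixel parameters theta_i plus one global background b,
# with data_i ~ N(theta_i + b, sigma). Purely illustrative.
n_pix, sigma = 5, 0.3
true_theta = rng.normal(1.0, 0.5, size=n_pix)
data = rng.normal(true_theta + 2.0, sigma)   # true background = 2.0

def log_like(theta_i, b, d_i):
    return -0.5 * ((d_i - theta_i - b) / sigma) ** 2

theta = np.zeros(n_pix)
b = 0.0
for _ in range(2000):
    # 1. Single-parameter MH step for each pixel (parallelizable across pixels).
    for i in range(n_pix):
        prop = theta[i] + 0.2 * rng.normal()
        if np.log(rng.random()) < log_like(prop, b, data[i]) - log_like(theta[i], b, data[i]):
            theta[i] = prop
    # 2. Linear estimate of the background given the pixel parameters.
    b = np.mean(data - theta)
```

Note that a fully Gibbs treatment would draw `b` from its conditional distribution rather than plug in the point estimate; the comment above describes an estimator, so that is what the sketch does.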

0 reactions
guro86 commented, Dec 6, 2022

Hello, I have a hierarchical problem and am intrigued by using a Gibbs sampler with an inner MH (or emcee) step. I see that you have been discussing this problem for a while now. Does anyone have an example implementation to share? @guillochon, you mentioned that you had success with this; do you still have the code to share?

I really appreciate any help you can provide.

Read more comments on GitHub >

Top Results From Across the Web

Gibbs sampling | Xi'an's Og
To recap, ensemble sampling moves a cloud of points (just like our bouncy particle sampler) one point X at a time by using...

slice sampling within a Gibbs sampler - Cross Validated
Use a separate and independent univariate slice sampler to draw y from p(y|x). In case there is any confusion with multivariate slice sampling, ...

Gibbs sampler revisit
lecture discusses a variety of tricks for designs using Gibbs and Metropolis. ... Sampling in each dimension according to a conditional probability ...

Zhu & Menard (2013)
distribution, in particular, Markov Chain Monte ... No need to use uniform sampling, can just as easily ... Metropolis-Hastings and Gibbs sampling are...

The Gibbs Centroid Sampler - PMC - NCBI
In so doing, it garners information from the full ensemble of solutions, rather than only the single most probable point that is the...
