RuntimeError: Random walk sampling appears to be stuck!
Hi,
I’m attempting to get dynesty to converge “nicely”. My model has 19 parameters and I run it with:
ndim = 19
dsampler = dynesty.DynamicNestedSampler(ln_like, prior_transform, ndim=ndim, bound='single', sample='rwalk', walks=50)
dsampler.run_nested(dlogz_init=1e-10, nlive_init=4000, nlive_batch=4000, wt_kwargs={'pfrac': 0.95})
Is setting dlogz_init=1e-10 an unrealistic value?
I'm not sure how to get dynesty to converge reliably. With MCMC I would just increase the number of walkers and steps, but with dynesty I've found that I need to rescale the log-likelihood function (where logl is a non-normalised chi-squared); otherwise the run finishes relatively quickly and the convergence isn't great. If I leave logl as the plain non-normalised chi-squared, however, it raises the RuntimeError below. Any tips would be greatly appreciated.
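For reference, my log-likelihood is essentially just a non-normalised chi-squared, along the lines of the sketch below (the data, model and fixed err value here are toy placeholders, not my actual 19-parameter setup):

import numpy as np

# Toy stand-ins purely for illustration; my real model has 19 parameters.
x_obs = np.linspace(0.0, 1.0, 50)
data = 2.0 * x_obs + 1.0
err = 0.1  # assumed, fixed noise level

def model_func(theta):
    slope, intercept = theta[0], theta[1]
    return slope * x_obs + intercept

def ln_like(theta):
    resid = data - model_func(theta)
    return -0.5 * np.sum((resid / err) ** 2)  # chi-squared only, no normalisation term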
Here's the full printout (since the error message says it may be useful):
RuntimeError: Random walk sampling appears to be stuck! Some useful output quantities:
u: [0.61211578 0.29883424 0.30418985 0.26553322 0.77596979 0.12880671
0.53166602 0.55272972 0.59784004 0.47026351 0.49768702 0.46909061
0.0492889 0.6416871 0.47095095 0.39649497 0.56340283 0.58003664
0.43703319]
drhat: [ 0.51942188 0.20292128 -0.25131422 -0.0462435 -0.04087769 -0.03899258
-0.28354396 -0.2750504 -0.45671211 0.22560593 0.04020417 -0.13088042
0.05010286 0.10831015 0.06636851 -0.0941852 0.0129034 0.33036297
-0.22233308]
dr: [ 0.50421269 0.19697954 -0.24395549 -0.04488944 -0.03968075 -0.03785084
-0.27524151 -0.26699665 -0.44333913 0.21899996 0.03902695 -0.12704811
0.0486358 0.10513872 0.06442517 -0.09142736 0.01252557 0.32068962
-0.21582295]
du: [ 6.34458514e-03 1.41546553e-04 -3.43680742e-04 -2.97290180e-04
-2.27000337e-03 -1.17278180e-03 -1.36977299e-03 -7.22953104e-05
-2.04123144e-04 4.20805157e-04 -4.65647849e-05 -1.46280561e-04
-2.21868830e-03 1.99202809e-03 -7.68809013e-04 1.79377016e-03
8.00802805e-05 1.15701526e-03 -8.07340036e-04]
u_prop: [0.61211578 0.29883424 0.30418985 0.26553322 0.77596979 0.12880671
0.53166602 0.55272972 0.59784004 0.47026351 0.49768702 0.46909061
0.0492889 0.6416871 0.47095095 0.39649497 0.56340283 0.58003664
0.43703319]
loglstar: -989554.5
logl_prop: -989554.8125
axes: [[ 1.25831524e-02 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[-9.59494844e-06 7.43145437e-04 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[ 3.71301087e-05 -1.15653354e-03 5.51693936e-04 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[ 6.94499246e-06 -1.69150355e-04 8.73413246e-04 1.21183982e-03
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[ 2.73957906e-04 -8.60692885e-03 2.74503412e-03 -8.04795262e-06
1.09480371e-03 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[ 1.24898412e-04 -5.24219719e-03 7.02620596e-04 -3.72378230e-04
5.08216528e-04 7.47482506e-04 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[-3.19867398e-04 -1.66256241e-03 -4.87165649e-04 6.37138555e-04
-2.21943007e-03 4.31434478e-04 3.78934642e-03 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[ 6.99011461e-05 7.50939116e-05 -3.75765342e-04 -5.11191945e-04
8.43199529e-05 -4.94147094e-05 -3.04046425e-04 1.19537144e-03
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[ 2.09550450e-05 3.67508706e-05 -3.12002971e-05 -1.24150632e-04
1.43859932e-04 8.81744894e-05 -2.82908663e-04 1.03809279e-04
6.23040387e-04 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[ 3.05516274e-05 2.70754057e-04 1.29882321e-04 -1.00638958e-04
2.85437772e-04 7.13926851e-07 -5.20945605e-04 -1.08674654e-04
-1.27351404e-04 7.38481547e-04 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[-1.38433571e-04 -3.04582253e-05 -1.36178139e-05 -2.89814475e-05
7.80309440e-05 -3.20340557e-05 3.34764725e-05 -1.31236383e-05
-7.06886466e-06 -3.81492237e-05 9.58990041e-04 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[-1.92016931e-05 -7.64026977e-06 6.12720171e-05 -1.42674137e-04
-2.05555985e-05 3.68160631e-05 -7.12946160e-06 3.92718877e-05
2.26589176e-05 3.33146619e-06 3.96333729e-04 9.72871403e-04
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[-4.41093062e-03 1.89035515e-05 4.67148659e-06 -3.70515645e-06
-1.84964955e-05 -3.39352322e-06 1.38457554e-05 3.16455044e-06
8.71002817e-06 8.37528244e-06 -1.22932324e-05 7.23592550e-08
1.83369330e-04 0.00000000e+00 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[-3.35641371e-04 6.84401918e-03 -3.36150827e-03 4.37332517e-04
-8.62379949e-04 1.61894799e-04 3.51631144e-04 -6.95863825e-05
1.58963842e-04 1.01999644e-04 6.54033393e-05 3.89800472e-05
1.73473518e-05 1.07010444e-03 0.00000000e+00 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[ 1.77472187e-04 -4.75235561e-03 -3.20157828e-04 -3.03513420e-04
1.04439265e-03 1.67462704e-04 -9.77002001e-05 -1.03067467e-05
4.51667620e-05 1.45626808e-04 2.11614695e-05 -4.69488571e-05
-6.59153625e-05 -5.98547171e-04 8.02648703e-04 0.00000000e+00
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[ 9.21585938e-05 1.20520033e-03 -6.63482858e-03 -7.46695507e-03
1.98472258e-04 5.08495839e-05 3.68956297e-04 3.08721385e-04
-2.93650482e-04 -3.06530234e-04 5.83505785e-05 1.67152074e-05
8.41808224e-05 -2.35639096e-04 -8.68709842e-04 2.58861780e-03
0.00000000e+00 0.00000000e+00 0.00000000e+00]
[-2.35606757e-04 -1.01496058e-03 2.06209330e-04 3.83653812e-04
-5.04916312e-04 -2.67252253e-03 -5.29116257e-04 -1.79550685e-04
-2.60915858e-04 -3.75345682e-04 -1.27468841e-04 -1.57467145e-04
7.57737881e-05 1.41913802e-04 -1.70889529e-03 -1.76620976e-03
2.64012072e-03 0.00000000e+00 0.00000000e+00]
[ 2.60843480e-04 -4.84038713e-05 7.51183672e-04 3.27388573e-04
5.05505263e-04 -5.59759483e-04 -1.10650854e-03 -6.76308608e-05
3.86106624e-05 -3.05519116e-04 -9.08987536e-05 -6.56174791e-05
-1.32180212e-04 -8.39534513e-05 -1.36644947e-04 -1.06686592e-03
1.91169322e-04 2.84573877e-03 0.00000000e+00]
[-1.11306400e-04 -2.84578527e-04 -3.70878506e-04 -5.95121814e-05
7.12390082e-04 8.54350560e-04 1.62642821e-03 -1.43116178e-04
2.63919121e-04 3.27381262e-04 7.70220025e-05 1.31537510e-04
1.39954314e-04 1.00476004e-03 -2.31798803e-04 4.33121317e-04
1.87592818e-04 -1.22857147e-05 1.46214361e-03]]
scale: 6.699929041550063e-13.
Okay, I've read your previous posts in detail, and it seems your likelihood function needs to be something like
sum(-0.5*(data-model)^2/err^2 - ln(err))
(up to an additive constant), where err is an extra free parameter for the noise in your data. This assumes that all your measurements are independent and share the same uncertainty.
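For concreteness, here is a minimal sketch of that likelihood with the noise level included as an extra free parameter (the data, model_func and prior bounds below are placeholders for illustration, not your actual setup):

import numpy as np

# Placeholder data and model purely for illustration.
x_obs = np.linspace(0.0, 1.0, 50)
data = 2.0 * x_obs + 1.0

def model_func(params):
    slope, intercept = params[0], params[1]
    return slope * x_obs + intercept

def ln_like(theta):
    *params, err = theta  # the last entry is the noise level
    resid = data - model_func(params)
    return np.sum(-0.5 * (resid / err) ** 2 - np.log(err))

def prior_transform(u):
    x = np.array(u, dtype=float)         # copy of the unit-cube sample
    x[:-1] = 10.0 * x[:-1]               # model parameters: uniform on [0, 10) for this demo
    x[-1] = 10.0 ** (6.0 * x[-1] - 3.0)  # noise level: log-uniform on [1e-3, 1e3]
    return x

Since the noise level adds one extra parameter, in your case you would pass ndim=20 (rather than 19) to DynamicNestedSampler.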
@segasai thanks for the suggestion. I went with something similar and the code now converges nicely. Thanks!