
Use Gaussian mixture as prior


Hi, I want to use a Gaussian mixture as the prior, so I defined a prior_transform that pushes a sample u from the unit hypercube to it. I use the first dimension of u to select a component of the mixture and then map u to that Gaussian component. It works, and I show my results below.

My question is whether my approach is correct in principle. Basically, I cut the hypercube into several chunks, and each chunk is mapped to one of the Gaussian components. Will this cause trouble when the likelihood contour shrinks in the hypercube while exploring a new live point?

The dimension of my model is 6. I place a bimodal bivariate Gaussian mixture on (x1, x2):

import numpy as np

# Gaussian mixture prior for (x1, x2); covariance is shared.
weights = [0.5, 0.5]
mu1 = np.array([0.0, 0.0])
mu2 = np.array([5.0, 0.0])
cov = np.array([[1.0, 0.0],
                [0.0, 0.00001]])
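A minimal sketch of the chunking approach described in the question (function name, the use of scipy's norm.ppf, and the exact way u[0] is reused are my own assumptions, written in a dynesty-style prior_transform signature):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical sketch: u[0] selects a mixture component by which weight
# chunk it falls into and, rescaled, also serves as the quantile for x1;
# u[1] is the quantile for x2. Shared diagonal covariance is assumed.
weights = np.array([0.5, 0.5])
means = np.array([[0.0, 0.0], [5.0, 0.0]])
sigmas = np.sqrt(np.array([1.0, 0.00001]))

def prior_transform(u):
    cum = np.cumsum(weights)
    k = np.searchsorted(cum, u[0])                    # which chunk u[0] hit
    u0 = (u[0] - (cum[k] - weights[k])) / weights[k]  # rescale to (0, 1)
    # Inverse-CDF map of each coordinate through the chosen component.
    x1 = means[k, 0] + sigmas[0] * norm.ppf(u0)
    x2 = means[k, 1] + sigmas[1] * norm.ppf(u[1])
    return np.array([x1, x2])
```

Note that this map is discontinuous at the chunk boundary (u[0] = 0.5 here), which is exactly where the question about contour shrinkage arises.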

The other three factors in my model are implemented through a conditional prior and the log-likelihood.

Conditional prior:
1. translation from (x1, x2) to (x3, x4): (x3, x4) = (x1, x2) + N(mu = [20, 0], sigma = cov)
2. translation from (x3, x4) to (x5, x6): (x5, x6) = (x3, x4) + N(mu = [0, 20], sigma = cov)

Log-likelihood:
1. observed translation from (x1, x2) to (x5, x6): (x5, x6) = (x1, x2) + N(mu = [20, 20], sigma = cov)
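The translations above might be sketched as a conditional prior transform for (x3, x4), (x5, x6) plus an observed-translation likelihood; the function names and the exact split below are assumptions on my part:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

# Shared covariance from the question.
cov = np.array([[1.0, 0.0],
                [0.0, 0.00001]])
sig = np.sqrt(np.diag(cov))

def conditional_prior_transform(u_tail, x12):
    """Map the last four hypercube coordinates given (x1, x2)."""
    # (x3, x4) = (x1, x2) + N([20, 0], cov)
    x34 = x12 + np.array([20.0, 0.0]) + sig * norm.ppf(u_tail[0:2])
    # (x5, x6) = (x3, x4) + N([0, 20], cov)
    x56 = x34 + np.array([0.0, 20.0]) + sig * norm.ppf(u_tail[2:4])
    return np.concatenate([x34, x56])

def loglikelihood(theta):
    """Observed translation from (x1, x2) to (x5, x6)."""
    x12, x56 = theta[0:2], theta[4:6]
    return multivariate_normal.logpdf(x56 - x12,
                                      mean=[20.0, 20.0], cov=cov)
```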

(Result plots attached: myplot1, myplot2, myplot3.)

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

2 reactions
segasai commented, Jul 15, 2021

My suggestion would be to instead define a single Gaussian prior G1(theta), whose mean/covariance are determined from the means/covariances of your two components, and then use the likelihood function L(theta) * G2(theta) / G1(theta), where G2(theta) is your two-Gaussian prior.
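This reweighting trick might be sketched as follows, using the bimodal prior from the question; moment-matching G1 to the mixture and the function names are my own assumptions:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

# Bimodal prior from the question; G1 is a single moment-matched
# Gaussian, and the factor G2/G1 restores the mixture inside the
# likelihood (computed in log space for stability).
weights = np.array([0.5, 0.5])
means = np.array([[0.0, 0.0], [5.0, 0.0]])
cov = np.array([[1.0, 0.0],
                [0.0, 0.00001]])

# Moment-match G1: overall mean and total covariance of the mixture.
mu_g1 = weights @ means
cov_g1 = cov + sum(w * np.outer(m - mu_g1, m - mu_g1)
                   for w, m in zip(weights, means))

def log_g1(x):
    return multivariate_normal.logpdf(x, mean=mu_g1, cov=cov_g1)

def log_g2(x):
    comps = [np.log(w) + multivariate_normal.logpdf(x, mean=m, cov=cov)
             for w, m in zip(weights, means)]
    return np.logaddexp(*comps)

def prior_transform(u):
    # cov_g1 is diagonal here, so a per-coordinate quantile map suffices.
    return mu_g1 + np.sqrt(np.diag(cov_g1)) * norm.ppf(u)

def reweighted_loglike(x, base_loglike=lambda x: 0.0):
    # Original log-likelihood plus the log-correction G2 - G1.
    return base_loglike(x) + log_g2(x) - log_g1(x)
```

This keeps the prior transform smooth (a single Gaussian), while the posterior is unchanged because the G2/G1 factor exactly cancels G1 and reinstates the mixture.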

0 reactions
doublestrong commented, Jul 24, 2021

Problem resolved! Thanks, guys! I may come back here after some tests.
