Use a Gaussian mixture as prior
Hi, I want to use a Gaussian mixture as a prior, so I defined a prior_transform that pushes a sample u from the unit hypercube to it. I use the first dimension of u to select a component from the mixture, and then map u to that Gaussian component. It works, and I will show my results later.
My question is whether this approach is correct in principle. Basically, I cut the hypercube into several chunks, and each chunk is mapped to one of the Gaussian components. Will this cause trouble when the likelihood contour is being shrunk in the hypercube to explore a new live point?
The dimension of my model is 6. I place a bimodal bi-variate Gaussian mixture on (x1, x2):
import numpy as np

# Gaussian mixture prior for (x1, x2); covariance is shared.
weights = [0.5, 0.5]
mu1 = np.array([0.0, 0.0])
mu2 = np.array([5.0, 0.0])
cov = np.array([[1.0, 0.0],
                [0.0, 1e-5]])
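The component-selection transform described above might look like this (a minimal sketch; `prior_transform_x1x2`, `mus`, and `sig` are illustrative names, and I rescale u[0] within its chunk so it can be reused for the inverse CDF instead of spending an extra dimension on selection):

```python
import numpy as np
from scipy.stats import norm

# Mixture parameters from above.
weights = np.array([0.5, 0.5])
mus = np.array([[0.0, 0.0], [5.0, 0.0]])
sig = np.sqrt([1.0, 1e-5])  # per-dimension std devs of the shared diagonal cov

def prior_transform_x1x2(u):
    """Map (u[0], u[1]) in the unit square to (x1, x2) from the mixture.

    u[0] selects the component: [0, 1] is cut at the cumulative weights,
    and u[0] is then rescaled within its chunk so it can be reused as the
    quantile for the selected component's inverse CDF.
    """
    cumw = np.cumsum(weights)
    k = int(np.searchsorted(cumw, u[0]))
    u0 = (u[0] - (cumw[k] - weights[k])) / weights[k]  # rescale to [0, 1)
    x1 = norm.ppf(u0, loc=mus[k, 0], scale=sig[0])
    x2 = norm.ppf(u[1], loc=mus[k, 1], scale=sig[1])
    return np.array([x1, x2])
```

For example, u = (0.25, 0.5) lands in the first chunk and maps to the mean of the first component, while u = (0.75, 0.5) maps to the mean of the second.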
The other three factors of my model are implemented through a conditional prior and the log-likelihood:

Conditional prior:
1. translation from (x1, x2) to (x3, x4): (x3, x4) = (x1, x2) + N(mu = [20, 0], sigma = cov)
2. translation from (x3, x4) to (x5, x6): (x5, x6) = (x3, x4) + N(mu = [0, 20], sigma = cov)

Log-likelihood:
1. observed translation from (x1, x2) to (x5, x6): (x5, x6) = (x1, x2) + N(mu = [20, 20], sigma = cov)
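The conditional translations and the likelihood factor might be implemented like this (a sketch under my reading of the model; `translate` would be called inside prior_transform with the unit-cube entries reserved for (x3, x4) and (x5, x6), and the helper names are mine):

```python
import numpy as np
from scipy.stats import norm

sig = np.sqrt([1.0, 1e-5])  # per-dimension std devs of the shared diagonal cov

def translate(xy, u, mu):
    """Conditional prior step: xy plus a draw from N(mu, cov), via inverse CDFs.

    xy is the already-transformed pair it is conditioned on; u holds the two
    unit-cube entries for the new pair.
    """
    return xy + norm.ppf(u, loc=mu, scale=sig)

def loglike(theta):
    """Likelihood of the observed translation (x1, x2) -> (x5, x6) ~ N([20, 20], cov)."""
    x1, x2, x3, x4, x5, x6 = theta
    d = np.array([x5 - x1, x6 - x2])
    return norm.logpdf(d, loc=[20.0, 20.0], scale=sig).sum()
```

With this setup, theta = (0, 0, 20, 0, 20, 20) sits exactly at the means of all three translations and maximizes the likelihood factor.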
Issue Analytics
- Created: 2 years ago
- Comments: 5 (1 by maintainers)
My suggestion would be to instead define a single Gaussian prior G1(theta), where the mean/covariance are determined from the means/covariances of your two components, and then use the likelihood function L(theta) * G2(theta) / G1(theta), where G2(theta) is your two-Gaussian prior.
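This reweighting could be sketched as follows (moment-matching G1 to the mixture is my assumption for how to "determine" its mean/covariance from the two components; `reweighted_loglike` is an illustrative name):

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

weights = np.array([0.5, 0.5])
mu1, mu2 = np.array([0.0, 0.0]), np.array([5.0, 0.0])
cov = np.array([[1.0, 0.0], [0.0, 1e-5]])

# Moment-matched single Gaussian G1 covering both components.
mu_g1 = weights[0] * mu1 + weights[1] * mu2
cov_g1 = (cov
          + weights[0] * np.outer(mu1, mu1)
          + weights[1] * np.outer(mu2, mu2)
          - np.outer(mu_g1, mu_g1))

def log_g1(x12):
    """Log-density of the single Gaussian actually used as the prior."""
    return mvn.logpdf(x12, mean=mu_g1, cov=cov_g1)

def log_g2(x12):
    """Log-density of the intended two-component mixture prior."""
    comps = [mvn.logpdf(x12, mean=m, cov=cov) for m in (mu1, mu2)]
    return np.logaddexp(np.log(weights[0]) + comps[0],
                        np.log(weights[1]) + comps[1])

def reweighted_loglike(theta, base_loglike):
    """log of L(theta) * G2(theta) / G1(theta), with (x1, x2) = theta[:2]."""
    x12 = theta[:2]
    return base_loglike(theta) + log_g2(x12) - log_g1(x12)
```

The sampler then sees the simple unimodal prior G1, and the correction factor G2/G1 in the likelihood restores the intended mixture posterior.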
Problem resolved! Thanks, guys! I may come back here after some tests.