Gaussian mixture with local and global parameters via the KLqp method

We would like to implement BBVI that simultaneously infers the local parameters and the global parameters of a Gaussian mixture model (GMM).

https://github.com/EmergentSystemLabStudent/BBVI_edward/blob/master/GMM.ipynb
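Concretely, the model is the standard Bayesian GMM with fixed component covariances, where the assignments $z_{1:N}$ are the local parameters and $(\pi, \mu_{1:K})$ are the global parameters:

$$
\pi \sim \mathrm{Dirichlet}(\alpha), \qquad
\mu_k \sim \mathcal{N}(m_0, \Sigma_0), \quad k = 1, \dots, K,
$$
$$
z_n \mid \pi \sim \mathrm{Categorical}(\pi), \qquad
x_n \mid z_n, \mu \sim \mathcal{N}(\mu_{z_n}, \Sigma_{z_n}), \quad n = 1, \dots, N.
$$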

We have already implemented BBVI that infers only the local parameters of the GMM (In [3]).

We have also implemented BBVI that infers only the global parameters of the GMM (In [7]).

However, we could not implement BBVI that infers the local and global parameters simultaneously. The code (In [11]) and the error message (In [12]) are shown below.

import tensorflow as tf
import edward as ed
from edward.models import Categorical, Dirichlet, Mixture, MultivariateNormalFullCovariance

K = 3    # number of mixture components (alpha has three entries)
N = 100  # number of data points; placeholder, should match len(x_data) from the notebook

# generative model
alpha = tf.constant([1.0, 1.0, 1.0])
pi = Dirichlet(concentration=alpha)
mu = [MultivariateNormalFullCovariance(loc=tf.constant([3.0, 3.0]),
                                       covariance_matrix=tf.constant([[1.0, 0.0], [0.0, 1.0]]))
      for k in range(K)]
sigma = [tf.constant([[1.0, 0.0], [0.0, 1.0]]) for k in range(K)]  # fixed covariances (assumed; defined elsewhere in the notebook)
z = [Categorical(probs=pi) for n in range(N)]
x = [Mixture(cat=z[n],
             components=[MultivariateNormalFullCovariance(loc=mu[k], covariance_matrix=sigma[k])
                         for k in range(K)])
     for n in range(N)]

# variational model
lambda_pi = tf.nn.softplus(tf.Variable([0.0, 0.0, 0.0]))  # softplus keeps the concentration positive
qpi = Dirichlet(concentration=lambda_pi)
qmu = [MultivariateNormalFullCovariance(loc=tf.Variable([4.0, 4.0]),
                                        covariance_matrix=tf.constant([[1.0, 0.0], [0.0, 1.0]]))
       for k in range(K)]
y = [tf.Variable([0.0, 0.0, 0.0]) for n in range(N)]
lambda_z = [tf.nn.softmax(y[n]) for n in range(N)]  # per-point assignment probabilities
qz = [Categorical(probs=lambda_z[n]) for n in range(N)]

latent_vars = {z[n]: qz[n] for n in range(N)}
latent_vars[pi] = qpi
for k in range(K):
    latent_vars[mu[k]] = qmu[k]
data = {x[n]: x_data[n] for n in range(N)}  # x_data: observed points loaded earlier in the notebook

inference = ed.KLqp(latent_vars=latent_vars, data=data)
inference.initialize(n_iter=100)

TypeError: cat must be a Categorical distribution, but saw: Tensor("inference_4/sample_4/Categorical_111/sample/Reshape_2:0", shape=(), dtype=int32)
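Our understanding of the error (please correct us if this is wrong): ed.KLqp builds its loss by copying the model graph and replacing every random variable in latent_vars with a sample from its variational factor. Each z[n] is therefore replaced by an int32 sample Tensor drawn from qz[n], and the copied Mixture rejects it, since its cat argument must be a Categorical distribution object rather than a Tensor. The error text above, which shows a sample of a Categorical being passed as cat, seems consistent with this.

One direction we are considering, sketched below under our assumptions and not yet verified, is to marginalize the local parameters out analytically: Mixture.log_prob already sums over the components, so z would not have to appear in latent_vars at all, and KLqp would never substitute a Tensor into cat.

# possible workaround (a sketch, not a confirmed fix): infer only the
# global parameters and let Mixture marginalize z in its density
latent_vars = {pi: qpi}
for k in range(K):
    latent_vars[mu[k]] = qmu[k]
inference = ed.KLqp(latent_vars=latent_vars, data=data)
inference.run(n_iter=100)

However, this gives up the explicit variational factors qz[n] over the local assignments, which are exactly what we want to infer.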

Could anyone suggest a good way to solve this problem?
