Bernoulli Mixture Models on MNIST

Hi all. I’m trying to recreate the Bernoulli mixture model over binarized MNIST digits from Section 9.3.3 of Bishop’s textbook in Edward, but I’m struggling with the ParamMixture class.

So far, I have:

import tensorflow as tf
from edward.models import Bernoulli, Beta, Dirichlet, ParamMixture

N = 10000    # number of images
K = 10       # number of mixture components
D = 28 * 28  # pixels per image

pi = Dirichlet(tf.ones(K), sample_shape=D)
mu = Beta(tf.ones(D), tf.ones(D), sample_shape=K)
x = ParamMixture(pi, {'probs': mu}, Bernoulli, sample_shape=N)
z = x.cat

This lets me define a ParamMixture with the right number of dimensions. However, I get an error during training:

Incompatible shapes: [10000,10,784] vs. [10000,784,10]

which looks like the component axis (K = 10) and the pixel axis (D = 784) end up transposed relative to each other somewhere.

If I try to change any of the shapes in the model parameters, the ParamMixture complains; if I have a working ParamMixture, I get the error during inference.
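
Just to spell out what I’m aiming for, here is the same generative model written with an explicit assignment variable instead of ParamMixture. This is only a sketch of the shapes I expect; I have not tried running inference on this version:

import tensorflow as tf
from edward.models import Bernoulli, Beta, Categorical, Dirichlet

N = 10000    # number of images
K = 10       # number of mixture components
D = 28 * 28  # pixels per image

pi = Dirichlet(tf.ones(K))                    # mixing proportions, shape [K]
mu = Beta(tf.ones([K, D]), tf.ones([K, D]))   # per-component pixel probabilities, shape [K, D]
z = Categorical(probs=pi, sample_shape=N)     # cluster assignment per image, shape [N]
x = Bernoulli(probs=tf.gather(mu, z))         # binary pixels, shape [N, D]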

In short: does anyone have an example of how to create a multi-dimensional Bernoulli Mixture Model? Any help would be greatly appreciated!

Can you report your Edward and TensorFlow versions? I ran the code on Edward 1.3.3 and on the latest development version; both ran successfully.


Thanks for responding! To clarify, the code posted above runs fine, but I get the Incompatible shapes error when I run:

import edward as ed
from edward.models import Empirical

T = 500  # number of samples in each Empirical approximation (EDIT: declared T)
qpi = Empirical(tf.Variable(tf.ones([T, D, K]) / K))
qmu = Empirical(tf.Variable(tf.zeros([T, K, D])))
qz = Empirical(tf.Variable(tf.zeros([T, N, D], dtype=tf.int32)))

inference = ed.Gibbs({pi: qpi, mu: qmu, z: qz}, data={x: x_data})
inference.initialize()

sess = ed.get_session()
tf.global_variables_initializer().run()

for _ in range(inference.n_iter):
  info_dict = inference.update()
  inference.print_progress(info_dict)
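
In case it matters, x_data is an [N, 784] array of binarized MNIST training images, built along these lines (the stock TF tutorial loader and the 0.5 threshold here are just illustrative):

import numpy as np
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("MNIST_data/")           # pixel intensities in [0, 1]
x_data = (mnist.train.images[:N] > 0.5).astype(np.int32)   # binarize to {0, 1}, shape [N, 784]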

I am running Edward 1.3.3 and TensorFlow 1.2.0-rc0 in virtualenv 15.1.0 under Python 2.7.10, on macOS 10.12.1.

I just discovered an ongoing GitHub discussion of the same question:
https://github.com/blei-lab/edward/issues/686
