NaN loss in Weighted Stochastic Block Model

Hi,
I tried the Weighted Stochastic Block Model, using example/sbm.py as a reference.
I chose a Gamma-Poisson conjugate model for the weighted graph, as follows.
import edward as ed
import tensorflow as tf
from edward.models import (Bernoulli, Beta, Dirichlet, Gamma,
                           Multinomial, Poisson, PointMass)
# MODEL

# Dirichlet-Multinomial conjugate model

gamma = Dirichlet(concentration=tf.ones([K]))
Z = Multinomial(total_count=1., probs=gamma, sample_shape=N)

# Beta-Bernoulli conjugate model

#Pi = Beta(concentration0=tf.ones([K, K]), concentration1=tf.ones([K, K]))
#X = Bernoulli(probs=tf.matmul(Z, tf.matmul(Pi, tf.transpose(Z))))

# Gamma-Poisson conjugate model
Pi = Gamma(concentration=tf.ones([K, K]), rate=tf.ones([K, K]))
X = Poisson(rate=tf.matmul(Z, tf.matmul(Pi, tf.transpose(Z))))

# INFERENCE (EM algorithm)

qgamma = PointMass(params=tf.nn.softmax(tf.Variable(tf.random_normal([K]))))
qPi = PointMass(params=tf.nn.sigmoid(tf.Variable(tf.random_normal([K, K]))))
qZ = PointMass(params=tf.nn.softmax(tf.Variable(tf.random_normal([N, K]))))

inference = ed.MAP({gamma: qgamma, Pi: qPi, Z: qZ}, data={X: X_data})

n_iter = c.n_iter
inference.initialize(n_iter=n_iter)

tf.global_variables_initializer().run()

for _ in range(inference.n_iter):
  info_dict = inference.update()
  inference.print_progress(info_dict)
inference.finalize()

The result:
1000/1000 [100%] ██████████████████████████████ Elapsed: 13s | Loss: nan
Resulting cluster assignments:
[0 0 0 …, 0 0 0]
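The all-zero vector means every row of qZ ends up with its argmax at cluster 0, i.e. the assignments have collapsed. For context on what the model computes: when the rows of Z are one-hot, entry (i, j) of tf.matmul(Z, tf.matmul(Pi, tf.transpose(Z))) is exactly Pi[z_i, z_j], the rate between the clusters of nodes i and j. A small numpy sketch (toy N, K, and Pi chosen for illustration, not values from my run):

```python
import numpy as np

# Toy sizes: N nodes, K clusters (illustrative only).
N, K = 4, 2

# One-hot cluster assignments: nodes 0,1 in cluster 0; nodes 2,3 in cluster 1.
z = np.array([0, 0, 1, 1])
Z = np.eye(K)[z]                      # shape (N, K), one one-hot row per node

# Block rate matrix: high within-cluster rates, low between-cluster rate.
Pi = np.array([[5.0, 0.1],
               [0.1, 3.0]])

# Same computation as the Poisson rate in the model: Z @ Pi @ Z.T
rates = Z @ Pi @ Z.T                  # shape (N, N)

# With one-hot Z, entry (i, j) is exactly Pi[z[i], z[j]].
assert np.allclose(rates[0, 1], Pi[0, 0])   # same cluster
assert np.allclose(rates[0, 2], Pi[0, 1])   # different clusters
```

With the softmax-parameterized qZ, the rows are only approximately one-hot, but the rate entries are still convex mixtures of the entries of Pi, so they stay positive.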

I would like some tips, such as a more numerically stable parameterization of the model or of the ed.MAP inference.
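One direction I am considering (just a sketch, and I am not sure it is the cause of the NaN): Pi has a Gamma prior with support (0, ∞), but the sigmoid in qPi constrains the point mass to (0, 1), which was only appropriate for the Beta-Bernoulli variant. An alternative would be tf.nn.softplus, which is positive but unbounded above. A numpy comparison of the two transforms:

```python
import numpy as np

def sigmoid(x):
    # Maps R -> (0, 1): suitable for Bernoulli probs,
    # but too restrictive for Gamma-distributed rates.
    return 1.0 / (1.0 + np.exp(-x))

def softplus(x):
    # Maps R -> (0, inf): matches the positive support of the Gamma prior.
    return np.log1p(np.exp(x))

x = np.linspace(-3.0, 3.0, 7)

# sigmoid can never represent a rate >= 1 ...
assert sigmoid(x).max() < 1.0
# ... while softplus stays positive and is unbounded above.
assert (softplus(x) > 0.0).all()
assert softplus(10.0) > 1.0
```

In the Edward code above, this would correspond to qPi = PointMass(params=tf.nn.softplus(tf.Variable(tf.random_normal([K, K])))), but I have not verified that this alone removes the NaN.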

Thanks for any help.