Mixture model where weights are dependent on covariates


Hello all,

I am new to Edward, but I was wondering whether anyone has had success with a regression mixture model (with Gaussian errors) in which the mixture weights are also determined by the covariates via a softmax function. I will try to explain it with some notation (forgive me if the typesetting is off).

y_i ~ Sum_k [ w_k(X[i, :]) * Normal(alpha_k + dot(beta_k, X[i, :]), sigma_k) ]


  1. i represents the index of the observation
  2. k represents the index of the mixing cluster, and there are k = 1…K mixing classes where K is known in advance (i.e. this is a finite mixture)
  3. the response variable, y, has i = 1…N elements
  4. the predictors, X, form an N x M matrix, with rows indexed by i and columns indexed by j = 1…M.

This is different from a conventional Gaussian mixture in that the means are not constant, and the regression is heteroskedastic.
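To make the generative process concrete, here is a small NumPy sketch of the model above. All parameter values (alpha, beta, sigma, and a gating matrix gamma for the softmax weights) are hypothetical placeholders, not anything from a fitted model:

```python
import numpy as np

rng = np.random.default_rng(0)

N, M, K = 1000, 4, 3  # observations, covariates, mixture components

# Hypothetical "true" parameters, one set per component k.
alpha = np.array([-2.0, 0.0, 2.0])          # intercepts, shape (K,)
beta = rng.normal(size=(K, M))              # regression weights, shape (K, M)
sigma = np.array([0.3, 0.5, 1.0])           # per-component noise scales

# Gating parameters: mixture weights are a softmax of a linear score of X.
gamma = rng.normal(size=(M, K))

X = rng.normal(size=(N, M))

# w[i, k] = softmax over k of X[i] @ gamma: covariate-dependent weights.
scores = X @ gamma
w = np.exp(scores - scores.max(axis=1, keepdims=True))
w /= w.sum(axis=1, keepdims=True)

# Draw each observation's component z_i, then its Gaussian response y_i.
z = np.array([rng.choice(K, p=w[i]) for i in range(N)])
y = alpha[z] + np.einsum('ij,ij->i', X, beta[z]) + sigma[z] * rng.normal(size=N)
```

Because the weights w[i] change with X[i], both the conditional mean and the conditional variance of y vary across the covariate space, which is exactly the heteroskedasticity noted above.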

While I have run many of the Edward tutorial models, I am still a little confused about how TensorFlow variables and Edward random variables can be interchanged. If anyone could sketch a basic template model, I would be glad to run it on data and report back with a review of how it works.

Any help would be appreciated.