Hello,

I have a problem to solve, where I am trying to constrain a posterior based on prior distributions and relations between my random variables. I am just starting with Edward, so I wanted to test a very simple case:

3 random variables A, B and C (each uniform in [0, 100]) that I want to constrain such that A + B + C = 100.

Sometimes, I also know the value of either B, C or both.

So I've tried to implement my problem as below:

# model
import tensorflow as tf
import edward as ed
from edward.models import Uniform, Empirical

A = Uniform(low=0.0, high=100.0)
B = Uniform(low=0.0, high=100.0)
C = Uniform(low=0.0, high=100.0)
SUM = A + B + C

# posterior distributions (empirical approximations)
NToys = 15000
Aq = Empirical(tf.Variable(tf.zeros(NToys)))
Bq = Empirical(tf.Variable(tf.zeros(NToys)))
Cq = Empirical(tf.Variable(tf.zeros(NToys)))

# inference, where SUM is constrained to 100, B to 5 and C to 15
inference = ed.HMC(
    {A: Aq},  # latent variable -> posterior approximation
    data={B: 5.0, C: 15.0, SUM: 100.0})  # observations
inference.run()

# print posterior mean and std over the last 1000 samples
print("A", Aq.params[14000:].eval().mean(), Aq.params[14000:].eval().std())

Unfortunately, when I run this, I get:

A Mean=28.692398 Std=13.735088

which seems too far from the actual solution (A = 80) for such a simple problem.
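For reference, the value I expect follows directly from the constraint; here is a minimal sanity check in plain Python (no Edward; the variable names are just for illustration):

```python
# Assuming the constraint A + B + C = 100 holds exactly:
# once B and C are observed, A is fully determined,
# so the posterior on A should collapse around a single point.
total = 100.0
B_obs, C_obs = 5.0, 15.0
A_expected = total - B_obs - C_obs
print(A_expected)  # 80.0
```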

What am I missing?

How would you implement my problem for variational inference? (I also tried that, but the results were even worse.)

Thanks for helping,

Loic