Mixing my own cost into ed.KLqp inference?


#1

Can I add my own cost/likelihood function to the optimizer and pass it to the inference in Edward?

For example, suppose I have:

loss = … # my custom cost
inference = ed.KLqp(…) # the KL inference from Edward

Then I choose an optimizer from TensorFlow:

optimizer = tf.train.RMSPropOptimizer(learning_rate).minimize(loss)

What happens after

inference.initialize(optimizer=optimizer)

Will the information from my loss actually be passed to the inference when I run inference.update()?

Apologies if this is a trivial question; I have not read the source code very carefully!


#2

No. Edward builds the loss function and gradients itself inside VariationalInference.initialize and only calls optimizer.apply_gradients on them, so the minimize(loss) op you built is never used. (The optimizer argument is expected to be a bare tf.train.Optimizer instance, not the result of calling minimize.)

If you've defined the loss yourself, it is easier to simply perform the optimization in TensorFlow directly.
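To illustrate, here is a minimal sketch of optimizing a self-defined loss directly in TensorFlow, bypassing Edward's inference machinery entirely. The variable and quadratic cost below are placeholders for whatever your actual model defines; it is written against the TF1-style API (via tf.compat.v1 on newer TensorFlow versions):

```python
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()  # use the TF1-style graph/session API

# Placeholder for your real model's parameters and custom cost.
w = tf.Variable(5.0)
loss = tf.square(w - 2.0)  # hypothetical custom cost, minimized at w = 2

# Build and run the training op yourself instead of handing it to Edward.
train_op = tf.train.RMSPropOptimizer(learning_rate=0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):
        sess.run(train_op)
    final_w = sess.run(w)

print(final_w)  # should end up close to 2.0
```

This is the standard build-loss / build-train-op / run-session loop; anything you can express as a TensorFlow scalar tensor can be plugged in as the loss.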


#3

Hi @JHChen, check this similar question: How to implement self-defined metric/loss function?