Sampling Outcomes with new data in Bayesian NN

Sorry if I’ve missed something, but when running the toy example, it seems to work only on test data that has the same shape as the data the neural network was originally built with.

Is there a way to change the size of the X data and y target vector so that tensors of a different size can be used for X and y? Or does using the weight posteriors in predictions require a workaround?

X = tf.placeholder(tf.float32, [N, D], name="X")
y = Normal(loc=neural_network(X), scale=0.1 * tf.ones(N), name="y")

Don’t declare the data set size in the placeholder; leave the batch dimension as None so it is inferred at feed time:

X = tf.placeholder(tf.float32, [None, D], name="X")

For the random variable, you can broadcast additional parameters appropriately or have the internals do it for you:

y = Normal(loc=neural_network(X), scale=0.1, name="y")
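A minimal sketch of why the scalar scale works, using plain NumPy (whose broadcasting rules TensorFlow distributions follow): a scalar `scale` broadcasts against a `loc` vector of any length, so the same random-variable definition serves any batch size N. The `neural_network` stand-in here is hypothetical, just to mimic a loc of varying length.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_y(loc, scale=0.1):
    # scale is a scalar; NumPy broadcasts it to loc's shape,
    # so no tf.ones(N)-style tiling is needed.
    return rng.normal(loc=loc, scale=scale)

for n in (5, 50):               # two different data set sizes
    loc = np.zeros(n)           # stand-in for neural_network(X)
    y = sample_y(loc)
    assert y.shape == (n,)      # one draw per data point, for any n
```

The same broadcasting happens inside the `Normal` random variable, which is why dropping the explicit `tf.ones(N)` lets the batch size vary freely.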

An example off the top of my head is examples/lstm.py.

Thanks so much… that makes it very clear.
