Sorry if I’ve missed something, but when running the toy example, it seems to only work on test data that has the same dimensions as the data the neural network was originally built with.
Is there a way to change the size of the X data and y target vector so that tensors of a different size can be used for X and y? Or does it require a workaround to use the weight posteriors in predictions?
X = tf.placeholder(tf.float32, [N, D], name="X")
y = Normal(loc=neural_network(X), scale=0.1 * tf.ones(N), name="y")
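For context, the behaviour I’m hoping for is that N gets read off the data at run time rather than being fixed when the model is defined. A minimal NumPy sketch of that idea (`make_scale` is a hypothetical helper I made up for illustration, not part of Edward):

```python
import numpy as np

def make_scale(X, noise=0.1):
    # Derive the batch size from X at call time instead of
    # baking a fixed N into the model definition.
    n = X.shape[0]
    return noise * np.ones(n)

# The scale vector tracks whatever batch size X happens to have:
print(make_scale(np.zeros((3, 2))).shape)   # (3,)
print(make_scale(np.zeros((50, 2))).shape)  # (50,)
```

In TensorFlow terms I’d guess this corresponds to declaring the placeholder as `tf.placeholder(tf.float32, [None, D])` and computing the scale from `tf.shape(X)[0]` instead of a fixed `N`, but I’m not sure whether Edward’s inference supports that.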