If you want a neural net with a Poisson likelihood at the end, check out http://edwardlib.org/tutorials/mixture-density-network. The tutorial defines a neural net that ends in a mixture-of-normals likelihood; you can apply the same idea with a Poisson.
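The swap amounts to passing the network's output through a positivity-enforcing link (e.g., exp or softplus) to get a rate, then using that rate in a Poisson likelihood instead of mixture parameters. Here is a minimal NumPy sketch of that generative model with point-estimate weights (the sizes and names are illustrative, not from the tutorial; in Edward you would build the same graph with TensorFlow ops and `edward.models.Poisson`):

```python
import numpy as np

rng = np.random.default_rng(0)

D, H, N = 3, 16, 5           # input dim, hidden units, batch size (illustrative)
x = rng.normal(size=(N, D))  # a batch of inputs

# Point-estimate ("non-Bayesian") network parameters,
# analogous to tf.Variable weights.
W0 = rng.normal(size=(D, H))
b0 = np.zeros(H)
W1 = rng.normal(size=(H, 1))
b1 = np.zeros(1)

# Forward pass: tanh hidden layer, then an exp link so the
# Poisson rate is strictly positive.
h = np.tanh(x @ W0 + b0)
rate = np.exp(h @ W1 + b1).ravel()  # shape (N,)

# Poisson likelihood at the output: one count per input.
y = rng.poisson(rate)
```

The only difference from the mixture-density tutorial is the output head: one positive rate per input feeding a Poisson, rather than means, scales, and mixing weights feeding a mixture of normals.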
I assume by Bayesian Poisson regression you mean placing priors over the neural net's weights. Whether this is easy depends on how you define the net. If you're using a higher-level API such as Keras or TensorFlow Slim, I'm not sure it's possible to pass in weights. If you're defining the layers manually, you can simply use, e.g.,
Normal(...) instead of tf.Variable(...) to form the matrix multiplications. See, e.g., http://edwardlib.org/getting-started.
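To make the swap concrete: replacing tf.Variable with Normal means the weights become random variables with a prior rather than free parameters. A NumPy sketch of one draw from that generative story (a single prior sample; in Edward you would declare `edward.models.Normal` weights and then run an inference algorithm such as `ed.KLqp` rather than sampling by hand):

```python
import numpy as np

rng = np.random.default_rng(1)

D, H, N = 3, 16, 5           # illustrative sizes
x = rng.normal(size=(N, D))

# Bayesian version: every weight and bias is drawn from a
# Normal(0, 1) prior instead of being a trainable tf.Variable.
W0 = rng.normal(loc=0.0, scale=1.0, size=(D, H))
b0 = rng.normal(loc=0.0, scale=1.0, size=H)
W1 = rng.normal(loc=0.0, scale=1.0, size=(H, 1))
b1 = rng.normal(loc=0.0, scale=1.0, size=1)

# Same forward pass and Poisson likelihood as before.
h = np.tanh(x @ W0 + b0)
rate = np.exp(h @ W1 + b1).ravel()
y = rng.poisson(rate)
```

The forward pass and likelihood are unchanged; only the weights switch from fixed tensors to random variables, which is exactly the one-line substitution described above.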