Black box likelihood


I currently use PyMC3 to run NUTS on a likelihood function and its gradients, which a C++ function of mine calculates. (I can't use TensorFlow because my calculations require arbitrary precision: TensorFlow's arbitrary-precision library tf-big requires TensorFlow no higher than 1.13, while Edward2 requires TensorFlow 2. I also expect the C++ implementation to be much faster anyway.) I have a class that collects the individual log-likelihoods and gradients and sums them, so I get the complete log-likelihood score and gradient. I follow https://docs.pymc.io/notebooks/blackbox_external_likelihood.html:

    import numpy as np
    import theano.tensor as tt

class LogLikeWithGrad(tt.Op):
    itypes = [tt.dvector] # expects a vector of parameter values when called
    otypes = [tt.dscalar] # outputs a single scalar value (the log likelihood)

    def __init__(self, times, muts):
        self.times = times
        self.muts = muts
        self.tele_likelihood = TelegraphLikelihood(times, muts)
        self.logpgrad = LogLikeGrad(self.tele_likelihood)

    def perform(self, node, inputs, outputs):
        (theta,) = inputs
        theta = np.exp(theta)  # parameters are sampled in log-space; map back to rates
        self.tele_likelihood.set_rates(theta[0], theta[1], theta[2])
        outputs[0][0] = self.tele_likelihood.get_log_likelihood_score()

    def grad(self, inputs, g):
        # vector-Jacobian product: g[0] is the gradient flowing in from above
        theta, = inputs
        return [g[0] * self.logpgrad(theta)]

class LogLikeGrad(tt.Op):
    itypes = [tt.dvector]
    otypes = [tt.dvector]

    def __init__(self, tele_likelihood):
        self.tele_likelihood = tele_likelihood

    def perform(self, node, inputs, outputs):
        (theta,) = inputs
        theta = np.exp(theta)
        # evaluate the gradients at the current point, not at whichever
        # rates happened to be set last
        self.tele_likelihood.set_rates(theta[0], theta[1], theta[2])
        grads = self.tele_likelihood.get_gradients()
        # chain rule for the log-space parametrisation:
        # d/dx logL(exp(x)) = exp(x) * dlogL/drate
        outputs[0][0] = np.multiply(theta, grads)
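
For context, the exp-then-multiply pattern in `LogLikeGrad.perform` is just the chain rule for sampling in log-space: if the sampler works with x = log(rate), then d/dx logL(exp(x)) = exp(x) * dlogL/drate evaluated at exp(x). A quick self-contained sanity check of that factor, using a made-up one-parameter log-likelihood (`log_like` below is purely illustrative, not my actual TelegraphLikelihood) and a finite-difference comparison:

```python
import math

def log_like(rate):
    # hypothetical one-parameter log-likelihood (Poisson-style, 3 events observed)
    return -rate + 3.0 * math.log(rate)

def dlog_like(rate):
    # analytic gradient with respect to the rate itself
    return -1.0 + 3.0 / rate

def grad_wrt_log_rate(x):
    # the chain-rule factor used in LogLikeGrad.perform: multiply by exp(x)
    rate = math.exp(x)
    return rate * dlog_like(rate)

# central finite difference in log-space should match the chain-rule gradient
x, h = 0.7, 1e-6
fd = (log_like(math.exp(x + h)) - log_like(math.exp(x - h))) / (2 * h)
print(abs(fd - grad_wrt_log_rate(x)) < 1e-6)  # True
```

If the `np.multiply(theta, grads)` step were dropped, NUTS would be fed gradients on the rate scale while stepping on the log scale, which silently degrades sampling.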

Is there something similar in Edward2? I can't find any tutorials on black-box likelihood functions. I would like to take advantage of Edward2's capabilities, especially its GPU support: NUTS currently takes tens of hours to finish in PyMC3, and Bayesian inference is only one component of my analysis.