Edward for Sequential Importance Resampling Particle Filter

Hi everyone!

Sorry if this is a noob question.

I was wondering whether particle filters are built into Edward? Specifically, I'm hoping to use a sequential importance resampling (SIR) algorithm and thought Edward might have something like that built in.

Thanks!

Per our reference API page, we don't. Contributions are of course welcome, and this one would be highly impactful.

Okay, thanks!

Hmm, I'm not too familiar with TensorFlow, but I'll see if I can figure out how to do it.

Hi Jonathan,
If you are OK with Matlab, have a look at SABL (Sequentially Adaptive Bayesian Learning). It implements a sequential Monte Carlo Bayesian inference algorithm and also makes use of importance sampling. You can read more about SABL at the following links:

https://sabl-projects-online.github.io/
https://www.uts.edu.au/about/faculty-science/what-we-do/our-research-areas/sequentially-adaptive-bayesian-learning-research

I too am learning about Edward and TensorFlow, and I am doing so with a view to porting SABL over to this platform. The Metropolis inference class in Edward looks to be a good starting point. Let me know if you are keen to collaborate.
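For reference, the Metropolis-Hastings class is typically invoked along the following lines. This is only a minimal sketch against the Edward 1.x API as I understand it; the toy Normal-Normal model, the chain length T, the proposal scale, and the fake data are placeholders I made up, not anything from SABL:

```python
import numpy as np
import tensorflow as tf
import edward as ed
from edward.models import Empirical, Normal

# Toy model: unknown mean mu with a standard normal prior,
# observed through 50 noisy draws.
mu = Normal(loc=0.0, scale=1.0)
x = Normal(loc=mu, scale=1.0, sample_shape=50)

# Empirical distribution that will hold the Markov chain samples,
# plus a random-walk proposal centred on the current state.
T = 5000
qmu = Empirical(params=tf.Variable(tf.zeros(T)))
proposal_mu = Normal(loc=mu, scale=0.5)

# Placeholder observed data.
x_data = np.random.normal(3.0, 1.0, 50).astype(np.float32)

inference = ed.MetropolisHastings({mu: qmu}, {mu: proposal_mu},
                                  data={x: x_data})
inference.run()
```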

Simon Yin

Hi Simon,

I’m okay with Matlab, but I’m not too well versed in machine learning stuff.

I’ve really only used the particle filter for my research project.

Sure, I would be interested in collaborating! I'm currently a graduate student, so I'm not too sure about time commitments, but I'll try to contribute on a regular basis.

My Github is jonathan-j-deng.

Thanks,
Jon

Hi Jonathan,
Perhaps it's best we keep the discussion here on the Edward forum and try to develop the sequential importance resampling solution using Edward.

That way we are contributing back to this very impressive project and to the very helpful people who are behind it.

So I would suggest putting together something that does the following:

  1. Take an initial sample for the particles from a prior distribution
  2. Evaluate these against an objective/loss function
  3. Propose better candidates based on the result of (2) and substitute these with some acceptance rate
  4. Repeat Steps 1-3 for some dynamically determined number of iterations
  5. Remove poor performing samples; and from the remaining particles, generate copies according to an “importance” weight + some random variation

This is, in simple terms, what SABL does in Matlab. But the attraction of Edward/TensorFlow is that it gives a basis for comparing against more recently developed inference methods such as KLqp and SGHMC. There is also the path to running on very low-power devices which TensorFlow offers.
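To make sure we are talking about the same procedure, here is a rough sketch of the resampling idea in steps 1, 2 and 5 in plain NumPy rather than Edward. The 1-D random-walk model, the function and parameter names, and the fixed particle count are all assumptions for illustration; the Edward version would replace the hand-written likelihood with TensorFlow ops over Edward random variables:

```python
import numpy as np

def sir_particle_filter(observations, n_particles=1000,
                        process_std=1.0, obs_std=1.0, seed=0):
    """Sequential importance resampling for a toy 1-D random-walk model:
        x_t = x_{t-1} + process noise,   y_t = x_t + observation noise.
    Returns the filtered posterior mean of the state at each time step."""
    rng = np.random.default_rng(seed)

    # Step 1: draw the initial particles from the prior.
    particles = rng.normal(0.0, 1.0, size=n_particles)
    means = []

    for y in observations:
        # Propagate each particle through the transition model (the proposal).
        particles = particles + rng.normal(0.0, process_std, size=n_particles)

        # Step 2: weight each particle by its observation likelihood.
        log_w = -0.5 * ((y - particles) / obs_std) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()

        means.append(np.sum(w * particles))

        # Step 5: resample -- drop poorly performing particles and copy
        # good ones in proportion to their importance weights.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]

    return np.array(means)

# Toy usage: filter noisy observations of a latent random walk.
rng = np.random.default_rng(1)
true_x = np.cumsum(rng.normal(size=100))
obs = true_x + rng.normal(size=100)
print(sir_particle_filter(obs)[:5])
```

The proposal/reweighting in the loop is the importance-sampling half, and the final resampling is where the "importance weight + copies" idea from step 5 shows up; the adaptive move steps (3 and 4) from SABL are what we would still need to add on top.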

Can you describe what end application you are targeting?

Hi,

I was wondering if there had been any progress made in this direction?

I want to use particle filter methods for time series, and the packages I have come across so far do not satisfy our requirements. As an example problem, we want to estimate the covariance of a multivariate time series using an HMM.

Regards,
Arsen

Hi Arsen,

Sorry, I haven't gotten anything done in this direction yet. I don't have a lot of familiarity with TensorFlow.