A simple TensorFlow implementation of forward-backward


I’ve written a fairly simple TensorFlow implementation of the forward and backward passes for a standard discrete HMM. It’s a small contribution, but hopefully helpful in some way. What improvements or modifications would make this more useful? (Or, more generally, what work would be most useful for improving HMM/time-series support?)
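For anyone unfamiliar with the algorithm being discussed, here is a minimal NumPy sketch of the forward-backward recursions for a discrete HMM (this is just the textbook algorithm for illustration, not the TensorFlow code referenced above; `forward_backward` and its argument names are my own, and a real implementation would work in log space or rescale `alpha`/`beta` to avoid underflow on long sequences):

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Forward-backward for a discrete HMM.

    pi  : (K,) initial state distribution
    A   : (K, K) transitions, A[i, j] = P(z_t = j | z_{t-1} = i)
    B   : (K, M) emissions,   B[k, m] = P(x_t = m | z_t = k)
    obs : (T,) observed symbol indices
    """
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))
    beta = np.zeros((T, K))

    # Forward pass: alpha[t, k] = P(x_1..x_t, z_t = k)
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # Backward pass: beta[t, k] = P(x_{t+1}..x_T | z_t = k)
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    likelihood = alpha[-1].sum()          # P(x_1..x_T)
    gamma = alpha * beta / likelihood     # smoothed posteriors P(z_t = k | x_1..x_T)
    return alpha, beta, gamma, likelihood
```

A useful sanity check is that `(alpha[t] * beta[t]).sum()` equals the sequence likelihood at every time step, and each row of `gamma` sums to one.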


I’ve been looking at implementing infinite hidden state HMMs (see here). This would definitely help with parallelization; I’d love some help with the rest!


I’d definitely be willing to help out where I can, even if it’s not strictly Edward-related. I’m trying to get a feel for probabilistic state space models of all kinds right now as I think they’ll be necessary for some stuff I’m working on.

My absolute dream would be to have a framework general enough to deal with most of the models and algorithms described by Tom Minka here in a unified way, and composable with other non-dynamic models.