Guided Filter Implementation #74
Really cool stuff! I'll have a look at the details. The main issue we had last time was the integration with the `SSMProblem` interface.
I managed to get the deep Gaussian proposal working with Mooncake (with a minor tweak). I figured I could use the feedback before committing to something along the lines of Frederic's comment. Let me know what you think, and feel free to add to this PR.
```julia
function GeneralisedFilters.distribution(
    model::AbstractStateSpaceModel,
    kernel::DeepGaussianProposal,
```
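For illustration only, here is a self-contained sketch of what a proposal kernel that conditions on the observation could look like. `DeepGaussianProposalSketch`, `proposal_mean`, and `propose` are hypothetical names for this sketch, not part of GeneralisedFilters' actual API:

```julia
# Hypothetical, self-contained sketch: a Gaussian proposal whose mean depends
# on both the previous state and the current observation, which is what
# distinguishes a guided proposal from the bootstrap transition.
struct DeepGaussianProposalSketch
    W::Matrix{Float64}   # maps [state; observation] to the proposal mean
    sigma::Float64       # fixed proposal standard deviation
end

# Mean of the proposal q(x_t | x_{t-1}, y_t); note that it conditions on y.
proposal_mean(k::DeepGaussianProposalSketch, x, y) = k.W * vcat(x, y)

# Draw one particle from the proposal.
function propose(rng, k::DeepGaussianProposalSketch, x, y)
    m = proposal_mean(k, x, y)
    return m .+ k.sigma .* randn(rng, length(m))
end
```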
The problem is that we have to write different models for different filters? Can I not just write one `distribution` and switch between the bootstrap filter and the guided filter?
I'm not sure I'm following...
This `distribution` defines the proposal, so you wouldn't want to use it in a bootstrap filter.
If you have some latent dynamics defined, you can either put that into the bootstrap filter or the guided filter, but with the latter you must also include a latent dynamics object to represent the proposal.
I guess the only issue is that the latent dynamics for the proposal is a different sort of object from the usual latent dynamics, since it conditions on the observation.
Could we pass that in as a kwarg? Treating it as some sort of control?
@FredericWantiez You will have to design an `AbstractProposal` specific to a given model; regardless, this isn't the case for the model itself. You can still plug the model into the bootstrap filter.
> Could we pass that in as a kwarg? Treating it as some sort of control?
I also want to be clear: `distribution` will dispatch differently because we require both the model and the observation, so there should be zero conflict between the interfaces. Ignoring information from the observation by default contradicts the purpose of the guided filter.
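To spell out why the observation must reach the proposal: in a guided filter the importance weight corrects for proposing from q(x' | x, y) rather than from the transition f(x' | x). A toy 1-D Gaussian sketch (every density and parameter below is made up for illustration; this is not package code):

```julia
# Toy 1-D Gaussian log-density; nothing here is package code.
lognormpdf(z, m, s) = -0.5 * log(2pi * s^2) - (z - m)^2 / (2s^2)

# Standard guided particle filter log-weight:
# log w = log p(y | x') + log f(x' | x) - log q(x' | x, y).
function guided_logweight(x, xnew, y)
    logf = lognormpdf(xnew, 0.9 * x, 1.0)              # transition f(x' | x)
    logg = lognormpdf(y, xnew, 0.5)                    # observation p(y | x')
    logq = lognormpdf(xnew, 0.5 * (0.9 * x + y), 1.0)  # proposal q(x' | x, y)
    return logg + logf - logq
end
```

When q coincides with f, the weight reduces to the observation likelihood, recovering the bootstrap filter as a special case.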
If either of you could construct an example where this breaks, that would be super helpful.
I implemented @FredericWantiez's suggestion from the last version of this PR, although I had to change the behavior of `predict` to ensure the unit tests pass. I'm not convinced this is the best approach, so feel free to jump in and critique. Essentially, `predict` now expects `predict(rng, model, algo, step, state, observation; kwargs...)`. It's not a perfect solution, so this can definitely be reverted, but it allows us to design an RBPF with a proposal, ensuring composability. Like I said above, suggestions are highly appreciated.
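As a rough, self-contained illustration of the signature change (the `Toy*` types below are hypothetical, not the package's real ones): passing the observation into `predict` lets a guided algorithm use it, while a bootstrap algorithm simply ignores the extra argument.

```julia
# Hypothetical types for illustration only.
abstract type ToyFilter end
struct ToyBootstrap <: ToyFilter end
struct ToyGuided <: ToyFilter
    gain::Float64  # how strongly the proposal pulls particles toward y
end

# Bootstrap predict: propagate through the (toy) dynamics, ignoring y.
predict(rng, algo::ToyBootstrap, x::Vector{Float64}, y::Float64) =
    0.9 .* x .+ 0.1 .* randn(rng, length(x))

# Guided predict: same dynamics, but the proposal shifts toward the observation.
predict(rng, algo::ToyGuided, x::Vector{Float64}, y::Float64) =
    0.9 .* x .+ algo.gain .* (y .- 0.9 .* x) .+ 0.1 .* randn(rng, length(x))
```

Because dispatch selects the method by algorithm type, both filters share one call site even though only one of them reads the observation.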
Unit tests pass locally and changes are merged from main. The changes do not affect the interface (besides the aforementioned observation being passed to `predict`). Please let me know if there are any questions.
Only minor nit-picks in my review. Overall this looks great.
Could you please add a unit test on a linear Gaussian model?
I agree with @FredericWantiez's comment that it's a bit strange to have the proposal distribution dispatch on the model.
Although I guess there is nothing stopping you from defining your proposal distribution on `<:AbstractSSM`; we just need to make sure this is documented in a tutorial/example.
I can see the benefits of letting it depend on the model though. Your proposal could have parameters that are the same/functions of the model parameters, e.g. a drift term.
This is the approach outlined in #73, where `update` and `predict` define the filter with a marginalization term subtracted from the logsumexp of the previously filtered states. This version changes the general `step` method for `AbstractParticleFilters`, since every algorithm so far can be easily adapted.

I intend on changing the proposal interface to something more along the lines of what @FredericWantiez suggested the first time I proposed this algorithm in this PR. We can iterate on that once the filtering component is figured out à la #73.
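As a hedged sketch of the general shape of such a `step` (toy, self-contained code under made-up dynamics; the package's real `step` differs), an observation-aware predict can be paired with a logsumexp-normalized update:

```julia
# Numerically stable logsumexp for normalizing log-weights.
logsumexp(v) = (m = maximum(v); m + log(sum(exp.(v .- m))))

# Toy step: propose all particles with the observation in scope ("predict"),
# then compute normalized log-weights ("update"). All densities are made up.
function toy_step(rng, xs::Vector{Float64}, y::Float64)
    # predict: a guided proposal pulls each particle halfway toward y
    xnew = [0.5 * (0.9 * x + y) + 0.1 * randn(rng) for x in xs]
    # update: toy unnormalized log-weights from a unit-variance likelihood
    lw = [-0.5 * (y - x)^2 for x in xnew]
    return xnew, lw .- logsumexp(lw)   # normalized so exp.(lw) sums to 1
end
```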
Lastly, I also included a demonstration of the guided filter by implementing VSMC with a tunable proposal. There are a number of issues with this first pass, but it serves as a nice demonstration of the flexibility of this interface.