Update to tilde overloads in mh.jl
#2360
base: ch
Conversation
```julia
# Just defer to `SampleFromPrior`.
retval = DynamicPPL.dot_assume(rng, SampleFromPrior(), dist, vns[1], var, vi)
# Update the Gibbs IDs because they might have been assigned in the `SampleFromPrior` call.
DynamicPPL.updategid!.((vi,), vns, (spl,))
# Return.
return retval
```
I'm wondering if we maybe should just move this "default" impl, which uses `SampleFromPrior` + `updategid!`, to DynamicPPL.jl itself. Thoughts @penelopeysm @mhauru?
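For concreteness, a hoisted "default" impl in DynamicPPL.jl might look roughly like this sketch. The signature and the `Sampler` dispatch here are assumptions extrapolated from the diff above, not DynamicPPL's actual API, so treat it as an illustration of the shape rather than a drop-in definition:

```julia
# Hypothetical generic fallback for samplers without a specialized `dot_assume`.
# Assumed signature mirrors the call site in the diff above; names are illustrative.
function DynamicPPL.dot_assume(rng, spl::DynamicPPL.Sampler, dist, vns, var, vi)
    # Defer the actual sampling to `SampleFromPrior`.
    retval = DynamicPPL.dot_assume(rng, SampleFromPrior(), dist, vns[1], var, vi)
    # Re-assign the Gibbs IDs that the `SampleFromPrior` call may have overwritten.
    DynamicPPL.updategid!.((vi,), vns, (spl,))
    return retval
end
```

Any sampler that needs different behaviour would still override this method; the fallback only removes the boilerplate duplicated across samplers like MH.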
Would this be useful for multiple samplers beyond MH?
Codecov Report

Attention: Patch coverage is

Additional details and impacted files

```
@@            Coverage Diff             @@
##               ch    #2360      +/-   ##
==========================================
+ Coverage   83.86%   84.36%   +0.50%
==========================================
  Files          24       24
  Lines        1580     1573       -7
==========================================
+ Hits         1325     1327       +2
+ Misses        255      246       -9
```

☔ View full report in Codecov by Sentry.
It occurs to me that the value generated by …

I don't know if there's an easy fix for this at all, though.
Oh, I see, I'm just confused. The values are stored in `varinfo`.

Edit: oh, I see, it's this function that's doing it (lines 267 to 282 in 40a0d84).
Sorry, not trying to be annoying and get a final word in, but having spent a good amount of time figuring out the interplay between Turing and AdvancedMH, I feel like this behaviour is a bit unexpected (and I actually now understand why it happens 😂):

```julia
@model function gdemo(x, y)
    s² ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s²))
    x ~ Normal(m, sqrt(s²))
    y ~ Normal(m, sqrt(s²))
end

chain = sample(
    gdemo(1.5, 2.0),
    MH(:m => AdvancedMH.RandomWalkProposal(Normal(0, 0.25))),
    10
)
```

Here …
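To make the surprising part of `RandomWalkProposal` concrete: a random-walk proposal perturbs the *current* value (`x′ = x + step`) rather than drawing a fresh value from the prior, which is why the initial state matters here. A minimal self-contained sketch of random-walk Metropolis-Hastings in plain Julia (no Turing/AdvancedMH; the function name `rwmh` and its arguments are illustrative, not any library's API):

```julia
using Random

# Minimal random-walk Metropolis-Hastings: each proposal is the current
# state plus Gaussian noise, accepted with the standard MH ratio (which
# simplifies because the Gaussian step is symmetric).
function rwmh(logdensity, x0; n=1000, σ=0.25, rng=Random.default_rng())
    xs = Vector{typeof(x0)}(undef, n)
    x, lp = x0, logdensity(x0)
    for i in 1:n
        x′ = x + σ * randn(rng)         # perturb the *current* value
        lp′ = logdensity(x′)
        if log(rand(rng)) < lp′ - lp    # symmetric proposal ⇒ plain ratio
            x, lp = x′, lp′
        end
        xs[i] = x
    end
    return xs
end

# Standard-normal target; the chain wanders around 0 in small σ-sized steps.
samples = rwmh(x -> -x^2 / 2, 0.0; n=5000)
```

With only 10 iterations, as in the `sample` call above, the chain barely moves away from wherever it starts, which is what makes the initial value so visible in the output.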
@penelopeysm I was looking at your PR #2341 and found some bad bugs in the existing codebase for `MH`, so I figured we should just get these fixed too.