Say we have:

```julia
using MeasureTheory, BenchmarkTools

m = For(1:1000) do j
    Normal(μ=j)
end

x = rand(m)
```
Then compare:

```julia
julia> @btime logdensityof($m, $x)
  3.114 μs (0 allocations: 0 bytes)
-1414.4176763297564

julia> @btime sum(j -> logdensityof(Normal(μ=j), $x[j]), 1:1000)
  684.154 ns (0 allocations: 0 bytes)
-1414.4176763297567
```
The `For` should be faster, because it knows about the common base measure. What's going on here?
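
For context, here's a rough sketch of the shortcut the common base measure should make possible, assuming `logdensityof(Normal(μ=j), x)` splits into `logdensity_def` (the unnormalized `-(x - μ)^2 / 2` kernel) plus a `-log(2π)/2` base-measure constant shared by every component. The helper name `manual_logdensity` is just for illustration, not an API:

```julia
using MeasureTheory

# Sketch only: assumes each component's log-density is
# logdensity_def(Normal(μ=j), x[j]) plus a shared -log(2π)/2 constant.
function manual_logdensity(x)
    # sum the unnormalized kernels
    s = sum(j -> logdensity_def(Normal(μ=j), x[j]), 1:1000)
    # add the shared base-measure normalization once for all 1000 components
    return s + 1000 * (-log(2π) / 2)
end
```

If that decomposition holds, `manual_logdensity(x)` should agree with the benchmark results above up to floating-point roundoff, and the `For` version ought to be able to do essentially this internally.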
Note that "slowness" is only relative to another approach in MeasureTheory. We're still well ahead of Distributions:
```julia
julia> @btime sum(j -> logdensityof(Dists.Normal(j, 1), $x[j]), 1:1000)
  7.166 μs (0 allocations: 0 bytes)
-1414.4176763297567
```