Description
Hello! I'm passing on this error originally reported at Turing.jl: TuringLang/Turing.jl#2364

Here's a simplified MWE that doesn't involve Turing at all. This example fails, but changing `Real` to `Float64` gives the correct derivative of 1.
```julia
using ReverseDiff  # `using ReverseDiff: gradient` alone wouldn't bind `ReverseDiff` for the qualified call below

function f(u)
    x = (Real[1.0, 2.0] * u[], u[])
    return last(x)
end

ReverseDiff.gradient(f, [2.0])
```
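For contrast, here is a sketch of the working variant mentioned above, with `Real` replaced by `Float64` (renamed `g` here only to avoid shadowing `f`):

```julia
using ReverseDiff

function g(u)
    x = (Float64[1.0, 2.0] * u[], u[])
    return last(x)
end

# Returns the expected derivative of 1, i.e. [1.0].
ReverseDiff.gradient(g, [2.0])
```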
Traceback
```
ERROR: MethodError: no method matching increment_deriv!(::Float64, ::Float64)
Closest candidates are:
  increment_deriv!(::ReverseDiff.TrackedReal, ::Real)
   @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/derivatives/propagation.jl:45
  increment_deriv!(::AbstractArray, ::Any)
   @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/derivatives/propagation.jl:38
  increment_deriv!(::ReverseDiff.TrackedArray, ::Real, ::Any)
   @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/derivatives/propagation.jl:34
  ...

Stacktrace:
  [1] increment_deriv!
    @ ~/.julia/packages/ReverseDiff/p1MzG/src/derivatives/propagation.jl:36 [inlined]
  [2] broadcast_increment_deriv!(input::Vector{…}, x::Vector{…}, partial::Float64, input_bound::CartesianIndex{…}, ::Nothing)
    @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/derivatives/propagation.jl:173
  [3] special_reverse_exec!(instruction::ReverseDiff.SpecialInstruction{…})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/derivatives/elementwise.jl:533
  [4] reverse_exec!(instruction::ReverseDiff.SpecialInstruction{Tuple{…}, Tuple{…}, ReverseDiff.TrackedArray{…}, Tuple{…}})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/tape.jl:93
  [5] reverse_pass!(tape::Vector{ReverseDiff.AbstractInstruction})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/tape.jl:87
  [6] reverse_pass!
    @ ~/.julia/packages/ReverseDiff/p1MzG/src/api/tape.jl:36 [inlined]
  [7] seeded_reverse_pass!(result::Vector{…}, output::ReverseDiff.TrackedReal{…}, input::ReverseDiff.TrackedArray{…}, tape::ReverseDiff.GradientTape{…})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/api/utils.jl:31
  [8] seeded_reverse_pass!(result::Vector{…}, t::ReverseDiff.GradientTape{…})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/api/tape.jl:47
  [9] gradient(f::Function, input::Vector{Float64}, cfg::ReverseDiff.GradientConfig{ReverseDiff.TrackedArray{…}})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/api/gradients.jl:24
 [10] gradient(f::Function, input::Vector{Float64})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/api/gradients.jl:22
 [11] top-level scope
    @ REPL[4]:1
Some type information was truncated. Use `show(err)` to see complete types.
```
Version info
The error above occurs with a fresh environment containing only [email protected].
```
Julia Version 1.10.5
Commit 6f3fdf7b362 (2024-08-27 14:19 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: macOS (arm64-apple-darwin22.4.0)
  CPU: 10 × Apple M1 Pro
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-15.0.7 (ORCJIT, apple-m1)
Threads: 1 default, 0 interactive, 1 GC (on 8 virtual cores)
```
Failing Turing model
Below is the simplest Turing model I could find that yields the same error. I'm including it here in case I over-simplified the MWE above.
```julia
using Turing

@model function f(x)
    u ~ Uniform(0, 1)
    return x * u
end

# Fails; works when `Real` is replaced with `Float64`, as above.
sample(f(Real[1.0, 2.0]), NUTS(; adtype=AutoReverseDiff()), 10)
```
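For comparison, the concretely typed call that samples without error (per the comment above) would be:

```julia
sample(f(Float64[1.0, 2.0]), NUTS(; adtype=AutoReverseDiff()), 10)
```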