Flip reverse and forward for second-order AD #109
Conversation
It looks like the Forward and Reverse backends have been switched.
I don't think so. The goal is to always have a forward-over-reverse second-order backend, so if the other half is not provided, this code completes the missing half accordingly.
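
(For illustration: forward-over-reverse means applying a forward-mode pass on top of a reverse-mode gradient, which is the usual sweet spot for Hessians and Hessian-vector products. A minimal sketch, assuming ForwardDiff.jl and Zygote.jl are available; the helper names `g` and `hvp` are illustrative, not from this PR.)

```julia
using ForwardDiff, Zygote

f(x) = sum(abs2, x) / 2

# Inner half: reverse mode computes the full gradient in one pass.
g(x) = Zygote.gradient(f, x)[1]

# Outer half: forward mode differentiates the gradient along a direction v,
# yielding a Hessian-vector product H(x) * v without forming H.
hvp(x, v) = ForwardDiff.derivative(t -> g(x .+ t .* v), 0.0)

hvp([1.0, 2.0], [1.0, 0.0])  # first column of the Hessian (here the identity)
```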
A few problems with this.

That's a fair point. I just wanted to point out that the problem isn't really a flip between modes.
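
(For context, a minimal setup of the kind that triggers the error reported below might look like the following. The actual `prob` and `solver` are not shown in the thread, so this problem definition is a hypothetical reproduction.)

```julia
using Optimization, OptimizationMOI, Ipopt
using DifferentiationInterface: SecondOrder
using ADTypes: AutoForwardDiff

# A standard test objective with parameters, as in the Optimization.jl docs.
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# Explicit forward-over-forward second-order backend.
adtype = SecondOrder(AutoForwardDiff(), AutoForwardDiff())

optf = OptimizationFunction(rosenbrock, adtype)
prob = OptimizationProblem(optf, [0.0, 0.0], (1.0, 100.0))
solver = Ipopt.Optimizer()
```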
Optimization.jl does not appear to handle that at the moment:

```
julia> res = solve(prob, solver)
ERROR: MissingBackendError: Failed to use SecondOrder(AutoForwardDiff(), AutoForwardDiff()).
Please open an issue: https://github.com/gdalle/DifferentiationInterface.jl/issues/new
Stacktrace:
[1] _prepare_pushforward_aux(f::Function, backend::SecondOrder{…}, x::Vector{…}, tx::DifferentiationInterface.Tangents{…}, ::DifferentiationInterface.PushforwardFast)
@ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/FTGtS/src/first_order/pushforward.jl:120
[2] prepare_pushforward(f::OptimizationBase.var"#_f#22"{…}, backend::SecondOrder{…}, x::Vector{…}, tx::DifferentiationInterface.Tangents{…})
@ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/FTGtS/src/first_order/pushforward.jl:93
[3] _prepare_pullback_aux(f::OptimizationBase.var"#_f#22"{…}, backend::SecondOrder{…}, x::Vector{…}, ty::DifferentiationInterface.Tangents{…}, ::DifferentiationInterface.PullbackSlow)
@ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/FTGtS/src/first_order/pullback.jl:104
[4] prepare_pullback(f::OptimizationBase.var"#_f#22"{…}, backend::SecondOrder{…}, x::Vector{…}, ty::DifferentiationInterface.Tangents{…})
@ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/FTGtS/src/first_order/pullback.jl:93
[5] prepare_gradient(f::OptimizationBase.var"#_f#22"{…}, backend::SecondOrder{…}, x::Vector{…})
@ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/FTGtS/src/first_order/gradient.jl:56
[6] instantiate_function(f::OptimizationFunction{…}, x::Vector{…}, adtype::SecondOrder{…}, p::Tuple{…}, num_cons::Int64; g::Bool, h::Bool, hv::Bool, fg::Bool, fgh::Bool, cons_j::Bool, cons_vjp::Bool, cons_jvp::Bool, cons_h::Bool, lag_h::Bool)
@ OptimizationBase ~/.julia/packages/OptimizationBase/3r1wm/src/OptimizationDIExt.jl:42
[7] instantiate_function
@ ~/.julia/packages/OptimizationBase/3r1wm/src/OptimizationDIExt.jl:28 [inlined]
[8] #instantiate_function#48
@ ~/.julia/packages/OptimizationBase/3r1wm/src/OptimizationDIExt.jl:294 [inlined]
[9] instantiate_function
@ ~/.julia/packages/OptimizationBase/3r1wm/src/OptimizationDIExt.jl:287 [inlined]
[10] OptimizationMOI.MOIOptimizationNLPCache(prob::OptimizationProblem{…}, opt::Ipopt.Optimizer; mtkize::Bool, callback::Nothing, kwargs::@Kwargs{…})
@ OptimizationMOI ~/.julia/packages/OptimizationMOI/tpWKG/src/nlp.jl:122
[11] MOIOptimizationNLPCache
@ ~/.julia/packages/OptimizationMOI/tpWKG/src/nlp.jl:108 [inlined]
[12] #__init#42
@ ~/.julia/packages/OptimizationMOI/tpWKG/src/OptimizationMOI.jl:302 [inlined]
[13] __init
@ ~/.julia/packages/OptimizationMOI/tpWKG/src/OptimizationMOI.jl:293 [inlined]
[14] #init#657
@ ~/.julia/packages/SciMLBase/SQnVC/src/solve.jl:174 [inlined]
[15] init
@ ~/.julia/packages/SciMLBase/SQnVC/src/solve.jl:172 [inlined]
[16] solve(::OptimizationProblem{…}, ::Ipopt.Optimizer; kwargs::@Kwargs{})
@ SciMLBase ~/.julia/packages/SciMLBase/SQnVC/src/solve.jl:96
[17] solve(::OptimizationProblem{…}, ::Ipopt.Optimizer)
@ SciMLBase ~/.julia/packages/SciMLBase/SQnVC/src/solve.jl:93
[18] top-level scope
@ REPL[4]:1
Some type information was truncated. Use `show(err)` to see complete types.
```

EDIT: @gdalle actually, it asks me to open an issue at DifferentiationInterface.jl. Is this combination, forward over forward, supposed to work?
Yeah, this combination works; you can disregard the message in this case. It's probably due to OptimizationBase not having a special case for passing SecondOrder. We can discuss it together @Vaibhavdixit02 if you like.
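
(To check the combination in isolation, one can call DifferentiationInterface directly rather than going through Optimization.jl. A minimal sketch, assuming DifferentiationInterface.jl and ForwardDiff.jl are installed:)

```julia
using DifferentiationInterface
using ADTypes: AutoForwardDiff
import ForwardDiff  # load the backend package so DI can use it

backend = SecondOrder(AutoForwardDiff(), AutoForwardDiff())
f(x) = sum(abs2, x)

# Forward-over-forward works fine when DI is called directly.
hessian(f, backend, [1.0, 2.0])  # 2×2 matrix equal to 2I
```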
I'll handle explicit second-order backends and conditional checking of loaded modules before choosing reverse mode in #110.
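
(A rough sketch of what "conditional checking of loaded modules" could look like; this is an assumed illustration of the plan for #110, not its actual implementation.)

```julia
using ADTypes: AbstractADType, AutoForwardDiff, AutoZygote
using DifferentiationInterface: SecondOrder

# Pick a reverse-mode inner backend only if Zygote has actually been loaded
# in this session; otherwise fall back to forward-over-forward.
function complete_second_order(outer::AbstractADType)
    zygote_loaded = any(pkg -> pkg.name == "Zygote", keys(Base.loaded_modules))
    inner = zygote_loaded ? AutoZygote() : AutoForwardDiff()
    return SecondOrder(outer, inner)
end
```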