Flip reverse and forward for second-order AD #109

Closed

Conversation

baggepinnen
Contributor

It looks like the Forward and Reverse backends have been switched.
@gdalle
Contributor

gdalle commented Sep 28, 2024

I don't think so. The goal is to always have a forward-over-reverse second-order backend, so if only half of it is provided, this code completes the SecondOrder with default choices.
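
For illustration, a minimal sketch of the kind of completion logic described here; this is not the actual OptimizationBase code, and the helper name is made up:

```julia
using ADTypes: SecondOrder, AutoForwardDiff, AutoReverseDiff

# Hypothetical helper, not the actual OptimizationBase implementation:
# promote a single backend to a full forward-over-reverse SecondOrder,
# while leaving an explicit SecondOrder untouched.
complete_second_order(adtype::SecondOrder) = adtype
complete_second_order(adtype::AutoForwardDiff) = SecondOrder(adtype, AutoReverseDiff())
complete_second_order(adtype::AutoReverseDiff) = SecondOrder(AutoForwardDiff(), adtype)
```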

@baggepinnen
Copy link
Contributor Author

A few problems with this:

  • How do I then accomplish forward-over-forward mode? My problem has 3 variables, so the overhead of reverse mode makes it far worse than forward over forward.
  • Using Optimization.AutoForwardDiff() will now error with "ReverseDiff not loaded", which is really unexpected since I haven't requested any use of ReverseDiff.

@gdalle
Contributor

gdalle commented Sep 28, 2024

That's a fair point; I just wanted to point out that the problem isn't really a flip between modes.
I'm not familiar with the internals of OptimizationBase (just trying to help Vaibhav adopt DifferentiationInterface), but for the first point, maybe passing SecondOrder(AutoForwardDiff(), AutoForwardDiff()) could work?
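
For concreteness, a sketch of the call that suggestion implies, assuming the usual Optimization.jl API (the objective, parameters, and initial point are made up for illustration); the next comment shows how this behaved at the time:

```julia
using Optimization, OptimizationMOI, Ipopt
using ADTypes: SecondOrder, AutoForwardDiff

# Made-up 3-variable objective, just to have something to differentiate
f(x, p) = sum(abs2, x .- p)

# Explicit forward-over-forward second-order backend
optf = OptimizationFunction(f, SecondOrder(AutoForwardDiff(), AutoForwardDiff()))
prob = OptimizationProblem(optf, zeros(3), ones(3))
solver = Ipopt.Optimizer()
res = solve(prob, solver)
```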

@baggepinnen
Contributor Author

baggepinnen commented Sep 28, 2024

Optimization.jl does not appear to handle that at the moment:

julia> res = solve(prob, solver)
ERROR: MissingBackendError: Failed to use SecondOrder(AutoForwardDiff(), AutoForwardDiff()).
Please open an issue: https://github.com/gdalle/DifferentiationInterface.jl/issues/new
Stacktrace:
  [1] _prepare_pushforward_aux(f::Function, backend::SecondOrder{…}, x::Vector{…}, tx::DifferentiationInterface.Tangents{…}, ::DifferentiationInterface.PushforwardFast)
    @ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/FTGtS/src/first_order/pushforward.jl:120
  [2] prepare_pushforward(f::OptimizationBase.var"#_f#22"{}, backend::SecondOrder{…}, x::Vector{…}, tx::DifferentiationInterface.Tangents{…})
    @ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/FTGtS/src/first_order/pushforward.jl:93
  [3] _prepare_pullback_aux(f::OptimizationBase.var"#_f#22"{}, backend::SecondOrder{…}, x::Vector{…}, ty::DifferentiationInterface.Tangents{…}, ::DifferentiationInterface.PullbackSlow)
    @ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/FTGtS/src/first_order/pullback.jl:104
  [4] prepare_pullback(f::OptimizationBase.var"#_f#22"{}, backend::SecondOrder{…}, x::Vector{…}, ty::DifferentiationInterface.Tangents{…})
    @ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/FTGtS/src/first_order/pullback.jl:93
  [5] prepare_gradient(f::OptimizationBase.var"#_f#22"{}, backend::SecondOrder{…}, x::Vector{…})
    @ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/FTGtS/src/first_order/gradient.jl:56
  [6] instantiate_function(f::OptimizationFunction{…}, x::Vector{…}, adtype::SecondOrder{…}, p::Tuple{…}, num_cons::Int64; g::Bool, h::Bool, hv::Bool, fg::Bool, fgh::Bool, cons_j::Bool, cons_vjp::Bool, cons_jvp::Bool, cons_h::Bool, lag_h::Bool)
    @ OptimizationBase ~/.julia/packages/OptimizationBase/3r1wm/src/OptimizationDIExt.jl:42
  [7] instantiate_function
    @ ~/.julia/packages/OptimizationBase/3r1wm/src/OptimizationDIExt.jl:28 [inlined]
  [8] #instantiate_function#48
    @ ~/.julia/packages/OptimizationBase/3r1wm/src/OptimizationDIExt.jl:294 [inlined]
  [9] instantiate_function
    @ ~/.julia/packages/OptimizationBase/3r1wm/src/OptimizationDIExt.jl:287 [inlined]
 [10] OptimizationMOI.MOIOptimizationNLPCache(prob::OptimizationProblem{…}, opt::Ipopt.Optimizer; mtkize::Bool, callback::Nothing, kwargs::@Kwargs{})
    @ OptimizationMOI ~/.julia/packages/OptimizationMOI/tpWKG/src/nlp.jl:122
 [11] MOIOptimizationNLPCache
    @ ~/.julia/packages/OptimizationMOI/tpWKG/src/nlp.jl:108 [inlined]
 [12] #__init#42
    @ ~/.julia/packages/OptimizationMOI/tpWKG/src/OptimizationMOI.jl:302 [inlined]
 [13] __init
    @ ~/.julia/packages/OptimizationMOI/tpWKG/src/OptimizationMOI.jl:293 [inlined]
 [14] #init#657
    @ ~/.julia/packages/SciMLBase/SQnVC/src/solve.jl:174 [inlined]
 [15] init
    @ ~/.julia/packages/SciMLBase/SQnVC/src/solve.jl:172 [inlined]
 [16] solve(::OptimizationProblem{…}, ::Ipopt.Optimizer; kwargs::@Kwargs{})
    @ SciMLBase ~/.julia/packages/SciMLBase/SQnVC/src/solve.jl:96
 [17] solve(::OptimizationProblem{…}, ::Ipopt.Optimizer)
    @ SciMLBase ~/.julia/packages/SciMLBase/SQnVC/src/solve.jl:93
 [18] top-level scope
    @ REPL[4]:1
Some type information was truncated. Use `show(err)` to see complete types.

EDIT: @gdalle actually, it asks me to open an issue at DifferentiationInterface.jl. Is this combination, forward over forward, supposed to work?

@gdalle
Contributor

gdalle commented Sep 28, 2024

Yeah, this combination works; you can disregard the message in this case. It's probably due to OptimizationBase not having a special case for passing SecondOrder. We can discuss it together, @Vaibhavdixit02, if you like.
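
For reference, the same combination can be exercised through DifferentiationInterface directly; a minimal sketch with a made-up test function:

```julia
using DifferentiationInterface, ForwardDiff
using ADTypes: SecondOrder, AutoForwardDiff

f(x) = sum(abs2, x)  # made-up test function
backend = SecondOrder(AutoForwardDiff(), AutoForwardDiff())

# Forward-over-forward Hessian: second-order operators accept a SecondOrder backend,
# whereas first-order operators like `gradient` (what OptimizationBase calls in the
# trace above) do not, which is where the MissingBackendError comes from.
H = hessian(f, backend, [1.0, 2.0, 3.0])
```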

@Vaibhavdixit02
Member

I'll handle explicit second order and conditional checking of loaded modules before choosing Reverse mode in #110
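
Purely as an illustration of what such a conditional check might look like (hypothetical, not the code in #110):

```julia
using ADTypes: SecondOrder, AutoForwardDiff, AutoReverseDiff

# Hypothetical sketch, not the actual #110 implementation: only default the inner
# backend to reverse mode when the user has actually loaded ReverseDiff.
function default_second_order()
    if isdefined(Main, :ReverseDiff)
        return SecondOrder(AutoForwardDiff(), AutoReverseDiff())
    else
        return SecondOrder(AutoForwardDiff(), AutoForwardDiff())
    end
end
```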

@baggepinnen baggepinnen deleted the patch-1 branch September 29, 2024 16:12