diff --git a/README.md b/README.md index fe37d0b..6b1fa83 100644 --- a/README.md +++ b/README.md @@ -50,7 +50,7 @@ It is possible to change the current value of a parameter with the function: ```julia -ParameterJuMP.setvalue!(p::Parameter, new_value::Number) +fix(p::Parameter, new_value::Number) ``` Finally, the `dual` function of JuMP is overloaded to return duals @@ -102,7 +102,7 @@ optimize!(model) dual(a) # modify the value of the parameter a to 20 -ParameterJuMP.setvalue!(a, 20) +fix(a, 20) # solve the model with the new value of the parameter optimize!(model) @@ -214,7 +214,7 @@ optimize!(model_pure) y_duals = dual.(y) # modify y -ParameterJuMP.setvalue.(y, new_value_for_y) +fix.(y, new_value_for_y) # solve problem (again) optimize!(model_pure) diff --git a/REQUIRE b/REQUIRE index 2cafaf4..7a0d0e9 100644 --- a/REQUIRE +++ b/REQUIRE @@ -1,3 +1,4 @@ julia 1.0 JuMP 0.19 0.20 MathOptInterface 0.8.4 +SparseArrays diff --git a/examples/benders_quatile_regression.jl b/examples/benders_quatile_regression.jl index 68ffa2d..f6e4501 100644 --- a/examples/benders_quatile_regression.jl +++ b/examples/benders_quatile_regression.jl @@ -42,12 +42,12 @@ using JuMP #' such as Clp, Xpress, Gurobi, CPLEX and so on) #+ echo = false -# using GLPK -# const OPTIMIZER = GLPK.Optimizer; +# using Xpress +# const OPTIMIZER = Xpress.Optimizer; #+ -using Xpress -const OPTIMIZER = Xpress.Optimizer; +using GLPK +const OPTIMIZER = GLPK.Optimizer; #' TimerOutputs: a time measuring library to demonstrate the advantage of using ParameterJuMP using TimerOutputs @@ -476,7 +476,7 @@ function slave_solve(PARAM, model, master_solution) # *parameters* can be set to new values and the optimization # model will be automatically updated β = model[2] - ParameterJuMP.setvalue!.(β, β0) + fix.(β, β0) else # JuMP also has the hability to fix variables to new values β_fixed = model[3] diff --git a/examples/html/benders_quatile_regression.html b/examples/html/benders_quatile_regression.html index 
504dc6f..f7b72be 100644 --- a/examples/html/benders_quatile_regression.html +++ b/examples/html/benders_quatile_regression.html @@ -694,8 +694,8 @@

Introduction

-using Xpress
-const OPTIMIZER = Xpress.Optimizer;
+using GLPK
+const OPTIMIZER = GLPK.Optimizer;
 
@@ -737,7 +737,7 @@

Norm-1 regression

\]

where $\varepsilon$ is some random error.

The estimation of the $\beta$ values relies on observations of the variables: $\{y^i, x_1^i, \dots, x_n^i\}_i$
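The norm-1 (least absolute deviation) estimator minimizes $\sum_i |y^i - \sum_j \beta_j x_j^i|$, which becomes an LP after splitting each residual into nonnegative parts, exactly the $\varepsilon^{up}, \varepsilon^{dw}$ variables used later in this notebook. A minimal JuMP sketch of that formulation (illustrative code, not the notebook's own; JuMP 0.19 syntax assumed):

```julia
# Sketch: norm-1 regression as an LP.
# min sum_i (ε_up_i + ε_dw_i)
# s.t. ε_up_i - ε_dw_i = y_i - sum_j β_j x_ij,  ε_up, ε_dw >= 0
using JuMP, GLPK

function norm1_regression(X::Matrix{Float64}, y::Vector{Float64})
    nobs, nvars = size(X)
    model = Model(with_optimizer(GLPK.Optimizer))
    @variable(model, β[1:nvars])
    @variable(model, ε_up[1:nobs] >= 0)
    @variable(model, ε_dw[1:nobs] >= 0)
    @constraint(model, [i in 1:nobs],
        ε_up[i] - ε_dw[i] == y[i] - sum(X[i, j] * β[j] for j in 1:nvars))
    @objective(model, Min, sum(ε_up) + sum(ε_dw))
    optimize!(model)
    return value.(β)
end
```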

-

In this notebook we will solver a problem where the explanatory variables are sinusoids of differents frequencies

+

In this notebook we will solve a problem where the explanatory variables are sinusoids of different frequencies
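As an illustration only (the notebook defines its own data; the dimensions and noise level here are assumptions), sinusoidal regressors of different frequencies could be generated like:

```julia
# Illustrative data generation: a design matrix of sinusoids of different
# frequencies and a noisy linear response.
nvars, nobs = 10, 600
X = [sin(2π * j * i / nobs) for i in 1:nobs, j in 1:nvars]
β_true = rand(nvars)
y = X * β_true .+ 0.1 .* randn(nobs)
```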

First, we define the number of explanatory variables and observations

@@ -895,16 +895,16 @@

Norm-1 regression

 First coefficients in solution: [3.76377, 0.790012, 0.878022, 0.0563155, 0.
 0876301, 0.314878, 0.00317523, 0.148887, 0.120253, 0.242875]
-Objective value: 42.62768238530737
-Time in solve: 0.762704413
-Time in build: 0.051016511
+Objective value: 42.627682385307324
+Time in solve: 10.853783266
+Time in build: 2.185323159
 

Benders decomposition

Benders decomposition is used to solve large optimization problems with some special characteristics. LPs can be solved with classical linear optimization methods such as the simplex method or interior point methods provided by solvers like GLPK. However, these methods do not scale linearly with the problem size. In the Benders decomposition framework we break the problem into two pieces: a master problem and a slave problem.

-

Of course some variables will belong to both problems, this is where the cleverness of Benders kicks in: The master problem is solved and passes the shared variables to the slave. The slave problem is solved with the shared variables FIXED to the values given by the master problem. The solution of the slave problem can be used to generate a constraint to the master problem to describe the linear approximation of the cost function of the shared variables. In many cases, like stochastic programming, the slave problems have a interestig structure and might be broken in smaller problem to be solved in parallel.

+

Of course some variables will belong to both problems; this is where the cleverness of Benders kicks in: the master problem is solved and passes the shared variables to the slave. The slave problem is solved with the shared variables FIXED to the values given by the master problem. The solution of the slave problem can be used to generate a constraint for the master problem that describes a linear approximation of the cost function of the shared variables. In many cases, like stochastic programming, the slave problems have an interesting structure and might be broken into smaller problems to be solved in parallel.
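The interaction just described can be sketched schematically; `solve_master`, `solve_slave`, and `add_cut!` below are hypothetical helpers, not this package's API, so treat this as commented pseudocode in Julia syntax rather than runnable code:

```julia
# Schematic Benders loop (helper names are illustrative).
function benders!(master, slaves; max_iter = 100, tol = 1e-6)
    for k in 1:max_iter
        x̄, lower_bound = solve_master(master)    # shared variables and LB
        upper_bound = 0.0
        for slave in slaves
            # the slave is solved with the shared variables FIXED at x̄
            obj, π = solve_slave(slave, x̄)        # objective value and duals
            add_cut!(master, obj, π, x̄)           # supporting hyperplane
            upper_bound += obj
        end
        upper_bound - lower_bound < tol && return x̄  # converged
    end
end
```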

We will describe the decomposition similarly to what is done in Introduction to Linear Optimization, Bertsimas & Tsitsiklis (Chapter 6.5), where the problem in question has the form

\[ @@ -938,7 +938,7 @@

Slave

& && {\varepsilon^{up}}_i, {\varepsilon^{dw}}_i \geq 0 && \forall i \in ObsSet(k) \notag \\ \end{align} \]

-

The collection $ObsSet(k)$ is a sub-set of the NObservations. Any partition of the NObservations collection is valid. In this notebook we will parition with the function:

+

The collection $ObsSet(k)$ is a subset of the NObservations. Any partition of the NObservations collection is valid. In this notebook we will partition with the function:
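One simple valid partition, shown here purely as an illustration (the notebook defines its own function), splits the observations into contiguous blocks:

```julia
# Illustrative contiguous partition of 1:nobs into nslaves blocks;
# block k gets observations first_i:last_i, the last block takes the remainder.
function obs_set(k::Int, nobs::Int, nslaves::Int)
    per = div(nobs, nslaves)
    first_i = (k - 1) * per + 1
    last_i = k == nslaves ? nobs : k * per
    return first_i:last_i
end

obs_set(2, 10, 3)  # → 4:6
```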

@@ -1026,7 +1026,7 @@ 

Slave

Master

-

Now that all pieces of the original problem can be representad by the convex $z_k(x)$ functions we can recast the problem inthe the equivalent form:

+

Now that all pieces of the original problem can be represented by the convex $z_k(x)$ functions we can recast the problem in the equivalent form:

\[ \begin{align} & \min_{x} && c^T x + z_1(x) + \dots + z_n(x) && \notag \\ @@ -1062,7 +1062,6 @@

Master

\end{align} \]

But now it is only an underestimated problem. In the case of our problem it can be written as:

-

It is possible to rewrite the above problem

\[ \begin{align} & \min_{\varepsilon, \beta} && \sum_{i \in Nodes} \varepsilon_i \notag \\ @@ -1144,7 +1143,7 @@

Supporting Hyperplanes

# *parameters* can be set to new values and the optimization # model will be automatically updated β = model[2] - ParameterJuMP.setvalue!.(β, β0) + fix.(β, β0) else # JuMP also has the hability to fix variables to new values β_fixed = model[3] @@ -1164,7 +1163,7 @@

Supporting Hyperplanes

π = dual.(β) else # or, in pure JuMP, we query the duals form - # constraints tha fix the values of our regression + # constraints that fix the values of our regression # coefficients π = dual.(β_fix) end @@ -1218,7 +1217,7 @@
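In isolation, the ParameterJuMP branch above boils down to the following pattern; the dimensions and values are made up for illustration, while `ModelWithParams`, `Parameters`, and the `fix`/`dual` behavior for parameters are the ones this diff introduces or documents:

```julia
using JuMP, GLPK
using ParameterJuMP

# β enters the model as parameters; fixing them updates the constraint
# right-hand sides in place instead of rebuilding the model.
model = ModelWithParams(with_optimizer(GLPK.Optimizer))
β = Parameters(model, zeros(3))        # three parameters, initially 0.0
@variable(model, x >= 0)
@constraint(model, x >= β[1] + β[2] + β[3])
@objective(model, Min, x)

fix.(β, [1.0, 2.0, 3.0])               # set new parameter values
optimize!(model)                        # model is re-synced before solving
π = dual.(β)                            # duals with respect to the parameters
```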

Algorithm wrap up

  • Solve the slave problem

  • -
  • query the solution of the slave problem to obtiain the supporting hyperplane

    +
  • query the solution of the slave problem to obtain the supporting hyperplane

  • add hyperplane to master problem

  • @@ -1226,7 +1225,7 @@

    Algorithm wrap up

    -

    Now we grab all the pieces tha we built and we writeh the benders algorithm by calling the above function in a proper order.

    +

    Now we grab all the pieces that we built and we write the Benders algorithm by calling the above functions in the proper order.

    The macro @timeit is used to time each step of the algorithm.
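    TimerOutputs usage follows this general pattern (a minimal sketch; the section names are illustrative placeholders for the notebook's actual steps):

```julia
using TimerOutputs

const to = TimerOutput()

@timeit to "Init" begin
    # build master and slave problems here (placeholder)
end
@timeit to "Loop" begin
    # iterative Benders steps here (placeholder)
end

print_timer(to)  # prints a time/allocation table like the ones below
```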

    @@ -1323,41 +1322,41 @@

    Algorithm wrap up

    Build slave problems Build initial cuts Initialize Iterative step -Iter = 1, LB = 0.0, UB = 736.9397549915154 -Iter = 2, LB = 0.0, UB = 736.9397549915154 -Iter = 3, LB = 16.349215016228914, UB = 91.35425023118906 -Iter = 4, LB = 28.27109871462413, UB = 55.078259122158684 -Iter = 5, LB = 36.35333651603361, UB = 50.10700424349081 -Iter = 6, LB = 40.587767838387414, UB = 45.263955986446454 -Iter = 7, LB = 42.040356808876794, UB = 44.03470247234688 +Iter = 1, LB = 0.0, UB = 243.75840787493047 +Iter = 2, LB = 0.0, UB = 243.75840787493047 +Iter = 3, LB = 5.38703942212879, UB = 243.75840787493047 +Iter = 4, LB = 23.450565692848926, UB = 57.99604186311867 +Iter = 5, LB = 33.35659250578711, UB = 49.12208051558014 +Iter = 6, LB = 40.13581171936486, UB = 46.13064724732366 +Iter = 7, LB = 41.97883975859214, UB = 43.66751463014951 Converged! -First coefficients in solution: [3.76267, 0.792786, 0.873536, 0.0558342, 0. -0914146, 0.312128, -0.00148945, 0.155308, 0.121462, 0.236182] -Objective value: 42.040356808876794 -Time in loop: 1.212996736 -Time in init: 0.839133386 +First coefficients in solution: [3.76754, 0.790079, 0.877002, 0.0552698, 0. 
+0869099, 0.314411, 0.00889972, 0.152491, 0.118625, 0.23782] +Objective value: 41.97883975859214 +Time in loop: 1.173810276 +Time in init: 4.767122531 ───────────────────────────────────────────────────────────────────────── Time Allocations ────────────────────── ─────────────────────── - Tot / % measured: 2.06s / 100% 261MiB / 100% + Tot / % measured: 6.75s / 88.0% 758MiB / 97.1% Section ncalls time %tot avg alloc %tot avg ───────────────────────────────────────────────────────────────────────── - Loop 1 1.21s 59.1% 1.21s 114MiB 43.8% 114MiB - solve nodes 7 821ms 40.0% 117ms 96.6MiB 37.0% 13.8MiB - opt 700 485ms 23.7% 693μs 2.49MiB 0.95% 3.64KiB - fix 700 208ms 10.1% 297μs 73.8MiB 28.3% 108KiB - dual 700 119ms 5.78% 170μs 20.0MiB 7.66% 29.3KiB - solve master 7 293ms 14.3% 41.8ms 1.55MiB 0.60% 227KiB - add cuts 7 71.8ms 3.50% 10.3ms 14.7MiB 5.63% 2.10MiB - Init 1 839ms 40.9% 839ms 147MiB 56.2% 147MiB - Cuts 1 632ms 30.8% 632ms 101MiB 38.9% 101MiB - opt 100 520ms 25.3% 5.20ms 93.3MiB 35.7% 955KiB - dual 100 16.9ms 0.82% 169μs 2.86MiB 1.09% 29.3KiB - fix 100 1.59ms 0.08% 15.9μs 633KiB 0.24% 6.33KiB - Slaves 1 206ms 10.0% 206ms 44.9MiB 17.2% 44.9MiB - Master 1 671μs 0.03% 671μs 148KiB 0.06% 148KiB + Init 1 4.77s 80.2% 4.77s 621MiB 84.3% 621MiB + Cuts 1 4.09s 68.8% 4.09s 530MiB 72.0% 530MiB + opt 100 2.07s 34.9% 20.7ms 264MiB 35.9% 2.64MiB + dual 100 312ms 5.25% 3.12ms 26.8MiB 3.64% 274KiB + fix 100 97.3ms 1.64% 973μs 4.13MiB 0.56% 42.3KiB + Slaves 1 610ms 10.3% 610ms 79.0MiB 10.7% 79.0MiB + Master 1 296μs 0.00% 296μs 147KiB 0.02% 147KiB Sol 1 855ns 0.00% 855ns 928B 0.00% 928B + Loop 1 1.17s 19.8% 1.17s 116MiB 15.7% 116MiB + solve master 7 462ms 7.77% 65.9ms 1.36MiB 0.18% 199KiB + solve nodes 7 442ms 7.44% 63.1ms 85.0MiB 11.5% 12.1MiB + fix 700 214ms 3.59% 305μs 64.6MiB 8.77% 94.5KiB + opt 700 112ms 1.89% 161μs 153KiB 0.02% 224B + dual 700 108ms 1.82% 154μs 20.0MiB 2.71% 29.3KiB + add cuts 7 219ms 3.68% 31.3ms 27.8MiB 3.77% 3.97MiB 
─────────────────────────────────────────────────────────────────────────
    @@ -1378,41 +1377,41 @@

    Algorithm wrap up

    Build slave problems Build initial cuts Initialize Iterative step -Iter = 1, LB = 0.0, UB = 736.9397549915191 -Iter = 2, LB = 0.0, UB = 736.9397549915191 -Iter = 3, LB = 16.34921501622873, UB = 91.35425023118692 -Iter = 4, LB = 28.271098714624266, UB = 55.078259122159 -Iter = 5, LB = 36.353336516033494, UB = 50.107004243491104 -Iter = 6, LB = 40.5877678383875, UB = 45.263955986447975 -Iter = 7, LB = 42.04035680887677, UB = 44.03470247234711 +Iter = 1, LB = 0.0, UB = 243.75840787492817 +Iter = 2, LB = 3.776535560004161e-14, UB = 243.75840787492817 +Iter = 3, LB = 5.38703942212852, UB = 243.75840787492817 +Iter = 4, LB = 23.45056569284757, UB = 57.996041863123985 +Iter = 5, LB = 33.356592505784526, UB = 49.12208051557272 +Iter = 6, LB = 40.13581171938589, UB = 46.13064724725085 +Iter = 7, LB = 41.97883975858674, UB = 43.667514630132594 Converged! -First coefficients in solution: [3.76267, 0.792786, 0.873536, 0.0558342, 0. -0914146, 0.312128, -0.00148945, 0.155308, 0.121462, 0.236182] -Objective value: 42.04035680887677 -Time in loop: 0.688577073 -Time in init: 0.355588428 +First coefficients in solution: [3.76754, 0.790079, 0.877002, 0.0552698, 0. 
+0869099, 0.314411, 0.00889972, 0.152491, 0.118625, 0.23782] +Objective value: 41.97883975858674 +Time in loop: 0.66281678 +Time in init: 1.524861044 ───────────────────────────────────────────────────────────────────────── Time Allocations ────────────────────── ─────────────────────── - Tot / % measured: 1.04s / 100% 64.6MiB / 100% + Tot / % measured: 2.19s / 100% 183MiB / 100% Section ncalls time %tot avg alloc %tot avg ───────────────────────────────────────────────────────────────────────── - Loop 1 689ms 65.9% 689ms 26.3MiB 40.7% 26.3MiB - solve nodes 7 353ms 33.8% 50.4ms 10.4MiB 16.1% 1.49MiB - opt 700 332ms 31.8% 475μs 8.34MiB 12.9% 12.2KiB - dual 700 10.7ms 1.03% 15.3μs 1.09MiB 1.69% 1.59KiB - fix 700 3.53ms 0.34% 5.05μs 613KiB 0.93% 896B - solve master 7 278ms 26.6% 39.7ms 1.55MiB 2.40% 227KiB - add cuts 7 56.7ms 5.43% 8.10ms 14.3MiB 22.2% 2.05MiB - Init 1 356ms 34.1% 356ms 38.3MiB 59.3% 38.3MiB - Cuts 1 268ms 25.7% 268ms 17.4MiB 26.9% 17.4MiB - opt 100 156ms 15.0% 1.56ms 11.4MiB 17.7% 117KiB - dual 100 2.66ms 0.25% 26.6μs 159KiB 0.24% 1.59KiB - fix 100 465μs 0.04% 4.65μs 87.5KiB 0.13% 896B - Slaves 1 85.7ms 8.21% 85.7ms 20.7MiB 32.1% 20.7MiB - Master 1 1.16ms 0.11% 1.16ms 148KiB 0.22% 148KiB - Sol 1 428ns 0.00% 428ns 928B 0.00% 928B + Init 1 1.52s 69.7% 1.52s 157MiB 86.1% 157MiB + Cuts 1 956ms 43.7% 956ms 95.7MiB 52.3% 95.7MiB + opt 100 456ms 20.9% 4.56ms 45.8MiB 25.1% 469KiB + dual 100 130ms 5.95% 1.30ms 14.6MiB 7.99% 150KiB + fix 100 430μs 0.02% 4.30μs 7.81KiB 0.00% 80.0B + Slaves 1 568ms 26.0% 568ms 61.5MiB 33.6% 61.5MiB + Master 1 275μs 0.01% 275μs 147KiB 0.08% 147KiB + Sol 1 856ns 0.00% 856ns 928B 0.00% 928B + Loop 1 663ms 30.3% 663ms 25.5MiB 13.9% 25.5MiB + solve master 7 400ms 18.3% 57.1ms 1.36MiB 0.74% 199KiB + solve nodes 7 204ms 9.31% 29.1ms 10.0MiB 5.48% 1.43MiB + opt 700 187ms 8.54% 267μs 8.70MiB 4.76% 12.7KiB + dual 700 5.76ms 0.26% 8.24μs 1.09MiB 0.60% 1.59KiB + fix 700 3.01ms 0.14% 4.30μs 54.7KiB 0.03% 80.0B + add cuts 7 53.5ms 2.45% 
7.64ms 14.1MiB 7.70% 2.01MiB ───────────────────────────────────────────────────────────────────────── @@ -1431,7 +1430,7 @@

    Algorithm wrap up

    - +

    Acknowledgments

    @@ -1444,7 +1443,7 @@

    Acknowledgments

    diff --git a/src/ParameterJuMP.jl b/src/ParameterJuMP.jl index 85f2561..a1243fb 100644 --- a/src/ParameterJuMP.jl +++ b/src/ParameterJuMP.jl @@ -1,13 +1,14 @@ module ParameterJuMP +using SparseArrays + +using JuMP using MathOptInterface const MOI = MathOptInterface const MOIU = MOI.Utilities -using JuMP - export -ModelWithParams, Parameter, Parameters, setvalue! +ModelWithParams, Parameter, Parameters # types # ------------------------------------------------------------------------------ @@ -16,7 +17,6 @@ struct Parameter <: JuMP.AbstractJuMPScalar ind::Int64 # local reference model::JuMP.Model end -JuMP.function_string(mode, ::Parameter) = "_param_" # Reference to a constraint in which the parameter has coefficient coef struct ParametrizedConstraintRef{C} @@ -47,12 +47,18 @@ mutable struct ParameterData parameters_map_saf_in_le::Dict{CtrRef{SAF, LE}, JuMP.GenericAffExpr{Float64,Parameter}} parameters_map_saf_in_ge::Dict{CtrRef{SAF, GE}, JuMP.GenericAffExpr{Float64,Parameter}} + names::Dict{Parameter, String} + # overload addvariable # variables_map::Dict{Int64, Vector{Int64}} # variables_map_lb::Dict{Int64, Vector{Int64}} # variables_map_ub::Dict{Int64, Vector{Int64}} - solved::Bool + # solved::Bool + + has_deletion::Bool + index_map::Dict{Int64, Int64} + lazy::Bool @@ -68,7 +74,10 @@ mutable struct ParameterData Dict{CtrRef{SAF, EQ}, JuMP.GenericAffExpr{Float64,Parameter}}(), Dict{CtrRef{SAF, LE}, JuMP.GenericAffExpr{Float64,Parameter}}(), Dict{CtrRef{SAF, GE}, JuMP.GenericAffExpr{Float64,Parameter}}(), + Dict{Parameter, String}(), + # false, false, + Dict{Int64, Int64}(), false, false, Float64[], @@ -76,6 +85,21 @@ mutable struct ParameterData end end +""" + index(p::Parameter)::Int64 + +Return the internal index of the parameter `p`. 
+""" +index(p::Parameter) = index(_getparamdata(p.model), p)::Int64 +index(model::JuMP.Model, p::Parameter) = index(_getparamdata(model), p)::Int64 +function index(data::ParameterData, p::Parameter)::Int64 + if data.has_deletion + return data.index_map[p.ind] + else + return p.ind + end +end + no_duals(data::ParameterData) = data.no_duals no_duals(model::JuMP.Model) = no_duals(_getparamdata(model)) @@ -211,123 +235,63 @@ function Parameters(model::JuMP.Model, val::AbstractArray{R,N}) where {R,N} return out end -# getters/setters +# solve # ------------------------------------------------------------------------------ -function JuMP.value(p::Parameter) - params = _getparamdata(p)::ParameterData - params.future_values[p.ind] -end - """ - setvalue!(p::Parameter, val::Real)::Nothing - -Sets the parameter `p` to the new value `val`. -""" -function setvalue!(p::Parameter, val::Real) - params = _getparamdata(p)::ParameterData - params.sync = false - params.future_values[p.ind] = val - return nothing -end -function JuMP.dual(p::Parameter) - params = _getparamdata(p)::ParameterData - if no_duals(params) - error("No dual query mode is activated for this model. 
If you plan to query duals you cannot call `set_no_duals`") - end - if lazy_duals(params) - return _getdual(p) - else - return params.dual_values[p.ind] - end -end - -function _getdual(p::Parameter) - return _getdual(p.model, p.ind) -end -function _getdual(pcr::ParametrizedConstraintRef)::Float64 - pcr.coef * JuMP.dual(pcr.cref) -end -function _getdual(model::JuMP.Model, ind::Integer) - params = _getparamdata(model)::ParameterData - # See (12) in http://www.juliaopt.org/MathOptInterface.jl/stable/apimanual.html#Duals-1 - # in the dual objective: - sum_i b_i^T y_i - # Here b_i depends on the param and is b_i' + coef * param - # so the dualobjective is: - # - sum_i b_i'^T y_i - param * sum_i coef^T y_i - # The dual of the parameter is: - sum_i coef^T y_i - return -sum(_getdual.(params.constraints_map[ind])) -end + ModelWithParams(args...; kwargs...)::JuMP.Model -# type 1 -# ------------------------------------------------------------------------------ +Returns a JuMP.Model able to handle `Parameters`. -const GAE{C,V} = JuMP.GenericAffExpr{C,V} -const GAEv{C} = JuMP.GenericAffExpr{C,JuMP.VariableRef} -const GAEp{C} = JuMP.GenericAffExpr{C,Parameter} +`args` and `kwargs` are the same parameters that would be passed +to the regular `Model` constructor. -mutable struct ParametrizedAffExpr{C} <: JuMP.AbstractJuMPScalar - v::JuMP.GenericAffExpr{C,JuMP.VariableRef} - p::JuMP.GenericAffExpr{C,Parameter} -end +Example using GLPK solver: -if VERSION >= v"0.7-" - Base.broadcastable(expr::ParametrizedAffExpr) = Ref(expr) +```julia + model = ModelWithParams(with_optimizer(GLPK.Optimizer)) +``` +""" +function ModelWithParams(args...; kwargs...) + m = JuMP.Model(args...; kwargs...) 
+ enable_parameters(m) + return m end -# JuMP.GenericAffExpr{C,Parameter}(params::Vector{Parameter},coefs::Vector{C}) = JuMP.GenericAffExpr{C}(params,coefs,C(0.0)) - -const PAE{C} = ParametrizedAffExpr{C} -const PAEC{S} = JuMP.ScalarConstraint{PAE{Float64}, S} -const PVAEC{S} = JuMP.VectorConstraint{PAE{Float64}, S} - - -# Build constraint -# ------------------------------------------------------------------------------ - -# TODO should be in MOI, MOIU or JuMP -_shift_constant(set::S, offset) where S <: Union{MOI.LessThan,MOI.GreaterThan,MOI.EqualTo} = S(MOIU.getconstant(set) + offset) -function _shift_constant(set::MOI.Interval, offset) - MOI.Interval(set.lower + offset, set.upper + offset) -end +""" + enable_parameters(m::JuMP.Model)::Nothing -function JuMP.build_constraint(_error::Function, aff::ParametrizedAffExpr, set::S) where S <: Union{MOI.LessThan,MOI.GreaterThan,MOI.EqualTo} - offset = aff.v.constant - aff.v.constant = 0.0 - shifted_set = _shift_constant(set, -offset) - return JuMP.ScalarConstraint(aff, shifted_set) +Enables a `JuMP.Model` to handle `Parameters`. +""" +function enable_parameters(m::JuMP.Model) + if haskey(m.ext, :params) + error("Model already has parameter enabled") + end + # test and compare hook + initialize_parameter_data(m) + JuMP.set_optimize_hook(m, parameter_optimizehook) + return nothing end -function JuMP.build_constraint(_error::Function, aff::ParametrizedAffExpr, lb, ub) - JuMP.build_constraint(_error, aff, MOI.Interval(lb, ub)) +function initialize_parameter_data(m::JuMP.Model) + m.ext[:params] = ParameterData() end -function JuMP.add_constraint(m::JuMP.Model, c::PAEC{S}, name::String="") where S - - # build LinearConstraint - c_lin = JuMP.ScalarConstraint(c.func.v, c.set) - - # JuMP´s standard add_constrint - cref = JuMP.add_constraint(m, c_lin, name) - +function parameter_optimizehook(m::JuMP.Model; kwargs...) 
data = _getparamdata(m)::ParameterData - # needed for lazy get dual - if lazy_duals(data) - for (param, coef) in c.func.p.terms - push!(data.constraints_map[param.ind], ParametrizedConstraintRef(cref, coef)) - end - end + # sync model rhs to newest parameter values + sync(data) - # save the parameter part of a parametric affine expression - _get_param_dict(data, S)[cref] = c.func.p + ret = JuMP.optimize!(m::JuMP.Model, ignore_optimize_hook = true, kwargs...) - return cref + # update duals for later query + if !no_duals(data) && JuMP.has_duals(m) + _update_duals(data) + end + return ret end -# solve -# ------------------------------------------------------------------------------ - """ sync(model::JuMP.Model)::Nothing @@ -336,9 +300,7 @@ Parameters. """ function sync(data::ParameterData) if !is_sync(data) - _update_constraints(data, EQ) - _update_constraints(data, LE) - _update_constraints(data, GE) + _update_constraints(data) data.current_values .= data.future_values data.sync = true end @@ -348,210 +310,18 @@ function sync(model::JuMP.Model) sync(_getparamdata(model)) end -function _update_constraints(data::ParameterData, ::Type{S}) where S - for (cref, gaep) in _get_param_dict(data, S) - _update_constraint(data, cref, gaep) - end - return nothing -end - -function _update_constraint(data::ParameterData, cref, gaep::GAEp{C}) where C - val = 0.0 - @inbounds for (param, coef) in gaep.terms - val += coef * (data.future_values[param.ind] - data.current_values[param.ind]) - end - _update_constraint(data, cref, val) - return nothing -end - -function _update_constraint(data::ParameterData, cref, val::Number) - if !iszero(val) - ci = JuMP.index(cref) - old_set = MOI.get(cref.model.moi_backend, MOI.ConstraintSet(), ci) - # For scalar constraints, the constant in the function is zero and the - # constant is stored in the set. Since `pcr.coef` corresponds to the - # coefficient in the function, we need to take `-pcr.coef`. 
- new_set = _shift_constant(old_set, -val) - MOI.set(cref.model.moi_backend, MOI.ConstraintSet(), ci, new_set) - end - return nothing -end - - -function _param_optimizehook(m::JuMP.Model; kwargs...) - data = _getparamdata(m)::ParameterData - - # sync model rhs to newest parameter values - sync(data) - - ret = JuMP.optimize!(m::JuMP.Model, ignore_optimize_hook = true, kwargs...) - data.solved = true - - if !lazy_duals(data) && JuMP.has_duals(m) && !no_duals(data) - fill!(data.dual_values, 0.0) - _update_duals(data, EQ) - _update_duals(data, GE) - _update_duals(data, LE) - end - return ret -end - -function _update_duals(data, ::Type{S}) where S - for (cref, gaep) in _get_param_dict(data, S) - _update_duals(data, cref, gaep) - end - return nothing -end - -function _update_duals(data, cref, gaep) - dual_sol = JuMP.dual(cref) - @inbounds for (param, coef) in gaep.terms - data.dual_values[param.ind] -= coef * dual_sol - end - return nothing -end - -""" - ModelWithParams(args...; kwargs...)::JuMP.Model - -Returns a JuMP.Model able to handle `Parameters`. - -`args` and `kwargs` are the same parameters that would be passed -to the regular `Model` constructor. - -Example using GLPK solver: - -```julia - model = ModelWithParams(with_optimizer(GLPK.Optimizer)) -``` -""" -function ModelWithParams(args...; kwargs...) - - m = JuMP.Model(args...; kwargs...) 
- - JuMP.set_optimize_hook(m, _param_optimizehook) - - m.ext[:params] = ParameterData() - - return m -end - -function JuMP.set_coefficient(con::CtrRef{F, S}, param::Parameter, coef::Number) where {F<:SAF, S} - data = _getparamdata(param) - dict = _get_param_dict(data, S) - old_coef = 0.0 - if haskey(dict, con) - gaep = dict[con] - old_coef = get!(gaep.terms, param, 0.0) - gaep.terms[param] = coef - else - # TODO fix type C - dict[con] = GAEp{Float64}(zero(Float64), param => coef) - end - if !iszero(coef-old_coef) - val = (coef-old_coef)*data.current_values[param.ind] - _update_constraint(data, con, val) - if !iszero(data.future_values[param.ind] - data.current_values[param.ind]) - data.sync = false - end - end - if lazy_duals(data) - ctr_map = data.constraints_map[param.ind] - found_ctr = false - if isempty(ctr_map) - push!(ctr_map, ParametrizedConstraintRef(con, coef)) - else - for (index, pctr) in enumerate(ctr_map) - if pctr.cref == con - if found_ctr - deleteat!(ctr_map, index) - else - ctr_map[index] = ParametrizedConstraintRef(con, coef) - end - found_ctr = true - end - end - end - if found_ctr && !iszero(data.future_values[param.ind]) - data.sync = false - end - end - nothing -end - -""" - delete_from_constraint(con, param::Parameter) - -Removes parameter `param` from constraint `con`. 
-""" -function delete_from_constraint(con::CtrRef{F, S}, param::Parameter) where {F, S} - data = _getparamdata(param) - dict = _get_param_dict(data, S) - if haskey(dict, con) - old_coef = get!(dict[con].terms, param, 0.0) - _update_constraint(data, con, (0.0-old_coef) * data.current_values[param.ind]) - delete!(dict[con].terms, param) - if !iszero(data.future_values[param.ind] - data.current_values[param.ind]) - data.sync = false - end - end - if lazy_duals(data) - ctr_map = data.constraints_map[param.ind] - found_ctr = false - for (index, pctr) in enumerate(ctr_map) - if pctr.cref == con - deleteat!(ctr_map, index) - found_ctr = true - end - end - if found_ctr && !iszero(data.future_values[param.ind]) - data.sync = false - end - end - nothing -end - -""" - delete_from_constraints(param::Parameter) - -Removes parameter `param` from all constraints. -""" -function delete_from_constraints(::Type{S}, param::Parameter) where S - data = _getparamdata(param) - dict = _get_param_dict(data, S) - for (con, gaep) in dict - if haskey(gaep.terms, param) - if !iszero(gaep.terms[param]) && !iszero(data.future_values[param.ind]) - data.sync = false - end - old_coef = get!(dict[con].terms, param, 0.0) - _update_constraint(data, con, (0.0-old_coef) * data.current_values[param.ind]) - delete!(gaep.terms, param) - end - end - if lazy_duals(data) - if !isempty(data.constraints_map[param.ind]) - data.constraints_map[param.ind] = ParametrizedConstraintRef[] - if !iszero(data.future_values[param.ind]) - data.sync = false - end - end - end - nothing -end +# constraints +# ------------------------------------------------------------------------------ +include("constraints.jl") -function delete_from_constraints(param::Parameter) - delete_from_constraints(EQ, param) - delete_from_constraints(LE, param) - delete_from_constraints(GE, param) - # TODO lazy - nothing -end +# JuMP variable interface +# ------------------------------------------------------------------------------ 
+include("variable_interface.jl") # operators and mutable arithmetics # ------------------------------------------------------------------------------ - include("operators.jl") include("mutable_arithmetics.jl") include("print.jl") + end diff --git a/src/constraints.jl b/src/constraints.jl new file mode 100644 index 0000000..6bc9177 --- /dev/null +++ b/src/constraints.jl @@ -0,0 +1,380 @@ +# type 1 +# ------------------------------------------------------------------------------ + +const GAE{C,V} = JuMP.GenericAffExpr{C,V} +const GAEv{C} = JuMP.GenericAffExpr{C,JuMP.VariableRef} +const GAEp{C} = JuMP.GenericAffExpr{C,Parameter} + +mutable struct ParametrizedAffExpr{C} <: JuMP.AbstractJuMPScalar + v::JuMP.GenericAffExpr{C,JuMP.VariableRef} + p::JuMP.GenericAffExpr{C,Parameter} +end + +const PAE{C} = ParametrizedAffExpr{C} +const PAEC{S} = JuMP.ScalarConstraint{PAE{Float64}, S} +const PVAEC{S} = JuMP.VectorConstraint{PAE{Float64}, S} + +Base.iszero(a::ParametrizedAffExpr) = iszero(a.v) && iszero(a.p) +Base.zero(::Type{ParametrizedAffExpr{C}}) where {C} = PAE{C}(zero(GAEv{C}), zero(GAEp{C})) +Base.one(::Type{ParametrizedAffExpr{C}}) where {C} = PAE{C}(one(GAEv{C}), zero(GAEp{C})) +Base.zero(a::ParametrizedAffExpr) = zero(typeof(a)) +Base.one( a::ParametrizedAffExpr) = one(typeof(a)) +Base.copy(a::ParametrizedAffExpr{C}) where C = PAE{C}(copy(a.v), copy(a.p)) +Base.broadcastable(expr::ParametrizedAffExpr) = Ref(expr) +# JuMP.GenericAffExpr{C,Parameter}(params::Vector{Parameter},coefs::Vector{C}) = JuMP.GenericAffExpr{C}(params,coefs,C(0.0)) + +ParametrizedAffExpr{C}() where {C} = zero(ParametrizedAffExpr{C}) + +function JuMP.map_coefficients_inplace!(f::Function, a::ParametrizedAffExpr) + map_coefficients_inplace!(f, a.v) + # The iterator remains valid if existing elements are updated. 
+ for (coef, par) in linear_terms(a.p) + a.p.terms[par] = f(coef) + end + return a +end + +function JuMP.map_coefficients(f::Function, a::ParametrizedAffExpr) + return JuMP.map_coefficients_inplace!(f, copy(a)) +end + +function Base.sizehint!(a::ParametrizedAffExpr, n::Int, m::Int) + sizehint!(a.v.terms, n) + sizehint!(a.p.terms, m) +end + +function JuMP.value(ex::ParametrizedAffExpr{T}, var_value::Function) where {T} + ret = value(ex.v, var_value) + for (param, coef) in ex.p.terms + ret += coef * var_value(param) + end + ret +end + +JuMP.constant(aff::ParametrizedAffExpr) = aff.v.constant + +# iterator + +# add to expression + +function Base.isequal(aff::ParametrizedAffExpr{C}, + other::ParametrizedAffExpr{C}) where {C} + return isequal(aff.v, other.v) && +isequal(aff.p, other.p) +end + +Base.hash(aff::ParametrizedAffExpr, h::UInt) = hash(aff.v.constant, hash(aff.v.terms, h), hash(aff.p.terms, h)) + +function SparseArrays.dropzeros(aff::ParametrizedAffExpr{C}) where C + v = SparseArrays.dropzeros(aff.v) + p = SparseArrays.dropzeros(aff.p) + return PAE{C}(v,p) +end + +# Check if two AffExprs are equal after dropping zeros and disregarding the +# order. Mostly useful for testing. +function JuMP.isequal_canonical(aff::ParametrizedAffExpr{C}, other::ParametrizedAffExpr{C}) where {C} + aff_nozeros = dropzeros(aff) + other_nozeros = dropzeros(other) + # Note: This depends on equality of OrderedDicts ignoring order. + # This is the current behavior, but it seems questionable. + return isequal(aff_nozeros, other_nozeros) +end + +Base.convert(::Type{ParametrizedAffExpr{C}}, v::JuMP.VariableRef) where {C} = PAE{C}(GAEv{C}(zero(C), v => one(C)), zero(GAEp{C})) +Base.convert(::Type{ParametrizedAffExpr{C}}, v::Real) where {C} = PAE{C}(GAEv{C}(convert(C, v)), zero(GAEp{C})) + +# Check all coefficients are finite, i.e. 
not NaN, not Inf, not -Inf +function JuMP._assert_isfinite(a::ParametrizedAffExpr) + _assert_isfinite(v) + for (coef, par) in linear_terms(a.p) + isfinite(coef) || error("Invalid coefficient $coef on parameter $par.") + end +end + +JuMP.value(a::ParametrizedAffExpr) = JuMP.value(a, value) + +function JuMP.check_belongs_to_model(a::ParametrizedAffExpr, model::AbstractModel) + JuMP.check_belongs_to_model(a.v, model) + JuMP.check_belongs_to_model(a.p, model) +end + +# Note: No validation is performed that the variables in the AffExpr belong to +# the same model. The verification is done in `check_belongs_to_model` which +# should be called before calling `MOI.ScalarAffineFunction`. +# function MOI.ScalarAffineFunction(a::AffExpr) +# _assert_isfinite(a) +# terms = MOI.ScalarAffineTerm{Float64}[MOI.ScalarAffineTerm(t[1], +# index(t[2])) +# for t in linear_terms(a)] +# return MOI.ScalarAffineFunction(terms, a.constant) +# end +# moi_function(a::GenericAffExpr) = MOI.ScalarAffineFunction(a) +# function moi_function_type(::Type{<:GenericAffExpr{T}}) where T +# return MOI.ScalarAffineFunction{T} +# end + + +# Copy an affine expression to a new model by converting all the +# variables to the new model's variables +# TODO function Base.copy(a::GenericAffExpr, new_model::AbstractModel) + + +# Build constraint +# ------------------------------------------------------------------------------ + +# TODO should be in MOI, MOIU or JuMP +_shift_constant(set::S, offset) where S <: Union{MOI.LessThan,MOI.GreaterThan,MOI.EqualTo} = S(MOIU.getconstant(set) + offset) +function _shift_constant(set::MOI.Interval, offset) + MOI.Interval(set.lower + offset, set.upper + offset) +end + +function JuMP.build_constraint(_error::Function, aff::ParametrizedAffExpr, set::S) where S <: Union{MOI.LessThan,MOI.GreaterThan,MOI.EqualTo} + offset = aff.v.constant + aff.v.constant = 0.0 + shifted_set = _shift_constant(set, -offset) + return JuMP.ScalarConstraint(aff, shifted_set) +end + +function 
JuMP.build_constraint(_error::Function, aff::ParametrizedAffExpr, lb, ub) + JuMP.build_constraint(_error, aff, MOI.Interval(lb, ub)) +end + +function JuMP.add_constraint(m::JuMP.Model, c::PAEC{S}, name::String="") where S + + # build LinearConstraint + c_lin = JuMP.ScalarConstraint(c.func.v, c.set) + + # JuMP's standard add_constraint + cref = JuMP.add_constraint(m, c_lin, name) + + data = _getparamdata(m)::ParameterData + + # needed for lazy get dual + if lazy_duals(data) + for (p, coef) in c.func.p.terms + push!(data.constraints_map[index(data, p)], ParametrizedConstraintRef(cref, coef)) + end + end + + # save the parameter part of a parametric affine expression + _get_param_dict(data, S)[cref] = c.func.p + + return cref +end + +# update constraint +# ------------------------------------------------------------------------------ + + +function _update_constraints(data::ParameterData) + _update_constraints(data, EQ) + _update_constraints(data, LE) + _update_constraints(data, GE) +end + +function _update_constraints(data::ParameterData, ::Type{S}) where S + for (cref, gaep) in _get_param_dict(data, S) + _update_constraint(data, cref, gaep) + end + return nothing +end + +function _update_constraint(data::ParameterData, cref, gaep::GAEp{C}) where C + val = 0.0 + @inbounds for (p, coef) in gaep.terms + ind = index(data, p) + val += coef * (data.future_values[ind] - data.current_values[ind]) + end + _update_constraint(data, cref, val) + return nothing +end + +function _update_constraint(data::ParameterData, cref, val::Number) + if !iszero(val) + ci = JuMP.index(cref) + old_set = MOI.get(cref.model.moi_backend, MOI.ConstraintSet(), ci) + # For scalar constraints, the constant in the function is zero and the + # constant is stored in the set. Since `val` shifts the constraint + # function's constant, we need to take `-val`.
+ new_set = _shift_constant(old_set, -val) + MOI.set(cref.model.moi_backend, MOI.ConstraintSet(), ci, new_set) + end + return nothing +end + +# Duals +# ------------------------------------------------------------------------------ + +function JuMP.dual(p::Parameter) + params = _getparamdata(p)::ParameterData + if lazy_duals(params) + return _getdual(p) + else + return params.dual_values[index(params, p)] + end +end + +# lazy + +function _getdual(p::Parameter) + return _getdual(p.model, index(p)) +end +function _getdual(pcr::ParametrizedConstraintRef)::Float64 + pcr.coef * JuMP.dual(pcr.cref) +end +function _getdual(model::JuMP.Model, ind::Integer) + params = _getparamdata(model)::ParameterData + # See (12) in http://www.juliaopt.org/MathOptInterface.jl/stable/apimanual.html#Duals-1 + # in the dual objective: - sum_i b_i^T y_i + # Here b_i depends on the param and is b_i' + coef * param + # so the dualobjective is: + # - sum_i b_i'^T y_i - param * sum_i coef^T y_i + # The dual of the parameter is: - sum_i coef^T y_i + return -sum(_getdual.(params.constraints_map[ind])) +end + +# non lazy + +function _update_duals(data::ParameterData) + if !lazy_duals(data) + fill!(data.dual_values, 0.0) + _update_duals(data, EQ) + _update_duals(data, GE) + _update_duals(data, LE) + end + return nothing +end + +function _update_duals(data::ParameterData, ::Type{S}) where S + for (cref, gaep) in _get_param_dict(data, S) + _update_duals(data, cref, gaep) + end + return nothing +end + +function _update_duals(data::ParameterData, cref, gaep) + dual_sol = JuMP.dual(cref) + @inbounds for (p, coef) in gaep.terms + data.dual_values[index(data, p)] -= coef * dual_sol + end + return nothing +end + +# constraint modification +# ------------------------------------------------------------------------------ + +function JuMP.set_coefficient(con::CtrRef{F, S}, p::Parameter, coef::Number) where {F<:SAF, S} + data = _getparamdata(p) + dict = _get_param_dict(data, S) + old_coef = 0.0 + ind = 
index(data, p) + if haskey(dict, con) + gaep = dict[con] + old_coef = get!(gaep.terms, p, 0.0) + gaep.terms[p] = coef + else + # TODO fix type C + dict[con] = GAEp{Float64}(zero(Float64), p => coef) + end + if !iszero(coef-old_coef) + val = (coef-old_coef)*data.current_values[ind] + _update_constraint(data, con, val) + if !iszero(data.future_values[ind] - data.current_values[ind]) + data.sync = false + end + end + if lazy_duals(data) + ctr_map = data.constraints_map[ind] + found_ctr = false + if isempty(ctr_map) + push!(ctr_map, ParametrizedConstraintRef(con, coef)) + else + for (p_ind, pctr) in enumerate(ctr_map) + if pctr.cref == con + if found_ctr + deleteat!(ctr_map, p_ind) + else + ctr_map[p_ind] = ParametrizedConstraintRef(con, coef) + end + found_ctr = true + end + end + end + if found_ctr && !iszero(data.future_values[ind]) + data.sync = false + end + end + nothing +end + +""" + delete_from_constraint(con, param::Parameter) + +Removes parameter `param` from constraint `con`. +""" +function delete_from_constraint(con::CtrRef{F, S}, p::Parameter) where {F, S} + data = _getparamdata(p) + dict = _get_param_dict(data, S) + ind = index(data, p) + if haskey(dict, con) + old_coef = get!(dict[con].terms, p, 0.0) + _update_constraint(data, con, (0.0-old_coef) * data.current_values[ind]) + delete!(dict[con].terms, p) + if !iszero(data.future_values[ind] - data.current_values[ind]) + data.sync = false + end + end + if lazy_duals(data) + ctr_map = data.constraints_map[ind] + found_ctr = false + for (index, pctr) in enumerate(ctr_map) + if pctr.cref == con + deleteat!(ctr_map, index) + found_ctr = true + end + end + if found_ctr && !iszero(data.future_values[ind]) + data.sync = false + end + end + nothing +end + +""" + delete_from_constraints(param::Parameter) + +Removes parameter `param` from all constraints. 
+""" +function delete_from_constraints(::Type{S}, p::Parameter) where S + data = _getparamdata(p) + dict = _get_param_dict(data, S) + ind = index(data, p) + for (con, gaep) in dict + if haskey(gaep.terms, p) + if !iszero(gaep.terms[p]) && !iszero(data.future_values[ind]) + data.sync = false + end + old_coef = get!(dict[con].terms, p, 0.0) + _update_constraint(data, con, (0.0-old_coef) * data.current_values[ind]) + delete!(gaep.terms, p) + end + end + if lazy_duals(data) + if !isempty(data.constraints_map[ind]) + data.constraints_map[ind] = ParametrizedConstraintRef[] + if !iszero(data.future_values[ind]) + data.sync = false + end + end + end + nothing +end + +function delete_from_constraints(param::Parameter) + delete_from_constraints(EQ, param) + delete_from_constraints(LE, param) + delete_from_constraints(GE, param) + # TODO lazy + nothing +end diff --git a/src/mutable_arithmetics.jl b/src/mutable_arithmetics.jl index f5def2e..533b216 100644 --- a/src/mutable_arithmetics.jl +++ b/src/mutable_arithmetics.jl @@ -80,9 +80,15 @@ end function JuMP.add_to_expression!(aff::PAE, new_coef, new_var::JuMP.VariableRef) JuMP.add_to_expression!(aff.v, new_coef, new_var) end +function JuMP.add_to_expression!(aff::PAE, new_var::JuMP.VariableRef) + JuMP.add_to_expression!(aff.v, new_var) +end function JuMP.add_to_expression!(aff::PAE, new_param::Parameter, new_coef) JuMP.add_to_expression!(aff.p, new_coef, new_param) end +function JuMP.add_to_expression!(aff::PAE, new_param::Parameter) + JuMP.add_to_expression!(aff.p,new_param) +end function JuMP.add_to_expression!(aff::PAE, new_coef, new_param::Parameter) JuMP.add_to_expression!(aff.p, new_coef, new_param) end \ No newline at end of file diff --git a/src/operators.jl b/src/operators.jl index 46e2530..873cef2 100644 --- a/src/operators.jl +++ b/src/operators.jl @@ -37,8 +37,9 @@ Base.:(+)(lhs::Parameter, rhs::Parameter) = PAE{Float64}(GAEv{Float64}(0.0), GAE Base.:(-)(lhs::Parameter, rhs::Parameter) = 
PAE{Float64}(GAEv{Float64}(0.0), GAEp{Float64}(0.0, lhs => 1.0, rhs => -1.0)) # Parameter--GAEp -# (+){C}(lhs::Parameter, rhs::GAEp{C})::GAEp{C} = GAEp{C}(vcat(rhs.vars,lhs),vcat(rhs.coeffs,one(C))) -# (-){C}(lhs::Parameter, rhs::GAEp{C})::GAEp{C} = GAEp{C}(vcat(rhs.vars,lhs),vcat(-rhs.coeffs,one(C))) +# these are used internally only, because no other function returns a GAEp +Base.:(+)(lhs::Parameter, rhs::GAEp{C}) where C = (+)(GAEp{C}(zero(C), lhs => 1.0), rhs) +Base.:(-)(lhs::Parameter, rhs::GAEp{C}) where C = (+)(GAEp{C}(zero(C), lhs => 1.0), -rhs) # Parameter--GAEv/GenericAffExpr{C,VariableRef} Base.:(+)(lhs::Parameter, rhs::GAEv{C}) where {C} = PAE{C}(copy(rhs),GAEp{C}(zero(C), lhs => 1.)) @@ -61,8 +62,8 @@ Base.:(+)(lhs::JuMP.VariableRef, rhs::GAEp{C}) where {C} = PAE{C}(GAEv{C}(zero(C Base.:(-)(lhs::JuMP.VariableRef, rhs::GAEp{C}) where {C} = PAE{C}(GAEv{C}(zero(C), lhs => 1.),-rhs) # VariableRef--ParametrizedAffExpr{C} -Base.:(+)(lhs::JuMP.VariableRef, rhs::PAE{C}) where {C} = PAE{C}(GAEv{C}(zero(C), lhs => 1.),copy(rhs.p)) -Base.:(-)(lhs::JuMP.VariableRef, rhs::PAE{C}) where {C} = PAE{C}(GAEv{C}(zero(C), lhs => 1.),-rhs.p) +Base.:(+)(lhs::JuMP.VariableRef, rhs::PAE{C}) where {C} = PAE{C}(lhs + rhs.v, copy(rhs.p)) +Base.:(-)(lhs::JuMP.VariableRef, rhs::PAE{C}) where {C} = PAE{C}(lhs - rhs.v, -rhs.p) #= GenericAffExpr{C,VariableRef} @@ -89,11 +90,11 @@ Base.:(-)(lhs::GAEv{C}, rhs::PAE{C}) where {C} = PAE{C}(lhs-rhs.v,-rhs.p) # GenericAffExpr{C,Parameter}--VariableRef Base.:(+)(lhs::GAEp{C}, rhs::JuMP.VariableRef) where {C} = (+)(rhs,lhs) -Base.:(-)(lhs::GAEp{C}, rhs::JuMP.VariableRef) where {C} = (-)(-rhs,lhs) +Base.:(-)(lhs::GAEp{C}, rhs::JuMP.VariableRef) where {C} = (+)(-rhs,lhs) # GenericAffExpr{C,Parameter}--GenericAffExpr{C,VariableRef} Base.:(+)(lhs::GAEp{C}, rhs::GAEv{C}) where {C} = (+)(rhs,lhs) -Base.:(-)(lhs::GAEp{C}, rhs::GAEv{C}) where {C} = (-)(-rhs,lhs) +Base.:(-)(lhs::GAEp{C}, rhs::GAEv{C}) where {C} = (+)(-rhs,lhs) #
GenericAffExpr{C,Parameter}--GenericAffExpr{C,Parameter} # DONE in JuMP @@ -106,23 +107,25 @@ Base.:(-)(lhs::GAEp{C}, rhs::PAE{C}) where {C} = PAE{C}(-rhs.v,lhs-rhs.p) ParametrizedAffExpr{C} =# +Base.:(-)(lhs::PAE{C}) where C = PAE{C}(-lhs.v, -lhs.p) + # Number--PAE Base.:(+)(lhs::PAE, rhs::Number) = (+)(rhs,lhs) -Base.:(-)(lhs::PAE, rhs::Number) = (-)(-rhs,lhs) +Base.:(-)(lhs::PAE, rhs::Number) = (+)(-rhs,lhs) Base.:(*)(lhs::PAE, rhs::Number) = (*)(rhs,lhs) # ParametrizedAffExpr{C}--Parameter Base.:(+)(lhs::PAE{C}, rhs::Parameter) where {C} = (+)(rhs,lhs) -Base.:(-)(lhs::PAE{C}, rhs::Parameter) where {C} = (-)(-rhs,lhs) +Base.:(-)(lhs::PAE{C}, rhs::Parameter) where {C} = (+)(-rhs,lhs) # VariableRef--ParametrizedAffExpr{C} Base.:(+)(lhs::PAE{C}, rhs::JuMP.VariableRef) where {C} = (+)(rhs,lhs) -Base.:(-)(lhs::PAE{C}, rhs::JuMP.VariableRef) where {C} = (-)(-rhs,lhs) +Base.:(-)(lhs::PAE{C}, rhs::JuMP.VariableRef) where {C} = (+)(-rhs,lhs) # ParametrizedAffExpr{C}--GenericAffExpr{C,VariableRef} # ParametrizedAffExpr{C}--GenericAffExpr{C,Parameter} Base.:(+)(lhs::PAE{C}, rhs::GAE{C,V}) where {C,V} = (+)(rhs,lhs) -Base.:(-)(lhs::PAE{C}, rhs::GAE{C,V}) where {C,V} = (-)(-rhs,lhs) +Base.:(-)(lhs::PAE{C}, rhs::GAE{C,V}) where {C,V} = (+)(-rhs,lhs) # ParametrizedAffExpr{C}--ParametrizedAffExpr{C} Base.:(+)(lhs::PAE{C}, rhs::PAE{C}) where {C} = PAE{C}(lhs.v+rhs.v,lhs.p+rhs.p) diff --git a/src/print.jl b/src/print.jl index 0125c3d..e5d774d 100644 --- a/src/print.jl +++ b/src/print.jl @@ -1,3 +1,43 @@ +function JuMP.function_string(::Type{REPLMode}, p::Parameter) + par_name = name(p) + if !isempty(par_name) + return par_name + else + return "noname_param" + end +end + +function JuMP.function_string(::Type{IJuliaMode}, p::Parameter) + par_name = name(p) + if !isempty(par_name) + # TODO: This is wrong if the parameter name contains an extra "]" + return replace(replace(par_name, "[" => "_{", count = 1), "]" => "}") + else + return "nonameParam" + end +end + function
JuMP.function_string(mode, a::ParameterJuMP.ParametrizedAffExpr, show_constant=true) - return JuMP.function_string(mode, a.v, show_constant) + ret = "" + str_v = JuMP.function_string(mode, a.v, false) + if str_v != "0" + ret = str_v + end + str_p = JuMP.function_string(mode, a.p, false) + if str_p != "0" + if ret == "" + ret = str_p + else + if str_p[1] == '-' + ret = ret * " - " * str_p[2:end] + else + ret = ret * " + " * str_p + end + end + end + if !JuMP._is_zero_for_printing(a.v.constant) && show_constant + ret = string(ret, JuMP._sign_string(a.v.constant), + JuMP._string_round(abs(a.v.constant))) + end + return ret end \ No newline at end of file diff --git a/src/variable_interface.jl b/src/variable_interface.jl new file mode 100644 index 0000000..e846084 --- /dev/null +++ b/src/variable_interface.jl @@ -0,0 +1,171 @@ + +# JuMP Variable interface +# ------------------------------------------------------------------------------ + +# main interface + +JuMP.is_fixed(p::Parameter) = true +JuMP.fix_index(p::Parameter) = + error("Parameters do not have explicit constraints, hence no constraint index.") +JuMP.set_fix_index(p::Parameter, cindex) = + error("Parameters do not have explicit constraints, hence no constraint index.") +function JuMP.fix(p::Parameter, val::Real) + data = _getparamdata(p)::ParameterData + data.sync = false + data.future_values[index(data, p)] = val + return nothing +end +JuMP.unfix(p::Parameter) = error("Parameters cannot be unfixed.") +function JuMP.fix_value(p::Parameter) + data = _getparamdata(p)::ParameterData + data.future_values[index(data, p)] +end +JuMP.FixRef(p::Parameter) = + error("Parameters do not have explicit constraints, hence no constraint reference.") + +function JuMP.value(p::Parameter) + data = _getparamdata(p)::ParameterData +
data.future_values[index(data, p)] +end + +# interface continues + +JuMP.owner_model(p::Parameter) = p.model + +struct ParameterNotOwned <: Exception + parameter::Parameter +end + +function JuMP.check_belongs_to_model(p::Parameter, model::AbstractModel) + if owner_model(p) !== model + throw(ParameterNotOwned(p)) + end +end + +Base.iszero(::Parameter) = false +Base.copy(p::Parameter) = Parameter(p.ind, p.model) +# Base.broadcastable(v::VariableRef) = Ref(v) # NEEDED??? + +""" + delete(model::Model, param::Parameter) + +Delete the parameter `param` from the model `model`. + +Note: +After the first deletion you might experience reduced performance. +Therefore, only use this command if there is no other way around. +""" +function JuMP.delete(model::Model, param::Parameter) + error("Parameters cannot be deleted currently.") + if model !== owner_model(param) + error("The variable reference you are trying to delete does not " * + "belong to the model.") + end + # create dictionary map + # turn flag has_deleted + # delete names ? +end + +""" + is_valid(model::Model, parameter::Parameter) + +Return `true` if `parameter` refers to a valid parameter in `model`. +""" +function JuMP.is_valid(model::Model, parameter::Parameter) + return model === owner_model(parameter) +end + +# The default hash is slow. It's important for the performance of AffExpr to +# define our own.
+# https://github.com/JuliaOpt/MathOptInterface.jl/issues/234#issuecomment-366868878 +function Base.hash(p::Parameter, h::UInt) + return hash(objectid(owner_model(p)), hash(p.ind, h)) +end +function Base.isequal(p1::Parameter, p2::Parameter) + return owner_model(p1) === owner_model(p2) && p1.ind == p2.ind +end + +# index(p::Parameter) = v.ind + +function JuMP.name(p::Parameter) + dict = _getparamdata(p).names + if haskey(dict, p) + return dict[p] + else + return "" + end +end + +function JuMP.set_name(p::Parameter, s::String) + dict = _getparamdata(p).names + dict[p] = s +end + +function parameter_by_name(model::Model, name::String) + # can be improved with a lazy rev_names map + dict = _getparamdata(model).names + for (par, n) in dict + if n == name + return par + end + end + return nothing +end + +JuMP.has_lower_bound(p::Parameter) = false +JuMP.LowerBoundRef(p::Parameter) = + error("Parameters do not have bounds.") +JuMP.lower_bound_index(p::Parameter) = + error("Parameters do not have bounds.") +JuMP.set_lower_bound_index(p::Parameter, cindex) = + error("Parameters do not have bounds.") +JuMP.set_lower_bound(p::Parameter, lower::Number) = + error("Parameters do not have bounds.") +JuMP.delete_lower_bound(p::Parameter) = + error("Parameters do not have bounds.") +JuMP.lower_bound(p::Parameter) = + error("Parameters do not have bounds.") + +JuMP.has_upper_bound(p::Parameter) = false +JuMP.UpperBoundRef(p::Parameter) = + error("Parameters do not have bounds.") +JuMP.upper_bound_index(p::Parameter) = + error("Parameters do not have bounds.") +JuMP.set_upper_bound_index(p::Parameter, cindex) = + error("Parameters do not have bounds.") +JuMP.set_upper_bound(p::Parameter, lower::Number) = + error("Parameters do not have bounds.") +JuMP.delete_upper_bound(p::Parameter) = + error("Parameters do not have bounds.") +JuMP.upper_bound(p::Parameter) = + error("Parameters do not have bounds.") + +JuMP.is_integer(p::Parameter) = false +JuMP.integer_index(p::Parameter) = + 
error("Parameters do not have integrality constraints.") +JuMP.set_integer_index(p::Parameter, cindex) = + error("Parameters do not have integrality constraints.") +JuMP.set_integer(p::Parameter) = + error("Parameters do not have integrality constraints.") +JuMP.unset_integer(p::Parameter) = + error("Parameters do not have integrality constraints.") +JuMP.IntegerRef(p::Parameter) = + error("Parameters do not have integrality constraints.") + +JuMP.is_binary(p::Parameter) = false +JuMP.binary_index(p::Parameter) = + error("Parameters do not have binary constraints.") +JuMP.set_binary_index(p::Parameter, cindex) = + error("Parameters do not have binary constraints.") +JuMP.set_binary(p::Parameter) = + error("Parameters do not have binary constraints.") +JuMP.unset_binary(p::Parameter) = + error("Parameters do not have binary constraints.") +JuMP.BinaryRef(p::Parameter) = + error("Parameters do not have binary constraints.") + +JuMP.start_value(p::Parameter) = + error("Parameters do not have start values.") +JuMP.set_start_value(p::Parameter, value::Number) = + error("Parameters do not have start values.") diff --git a/test/runtests.jl b/test/runtests.jl index 906640a..f4e1213 100644 --- a/test/runtests.jl +++ b/test/runtests.jl @@ -25,5 +25,6 @@ include("tests.jl") test11(factory) test12(factory) test13(factory) + test14(factory) end ; diff --git a/test/tests.jl b/test/tests.jl index fc38dfe..14799b9 100644 --- a/test/tests.jl +++ b/test/tests.jl @@ -51,7 +51,7 @@ function test2(args...) @testset "LessThan modification" begin model = ModelWithParams(args...) α = Parameter(model, 1.0) - ParameterJuMP.setvalue!(α, -1.0) + fix(α, -1.0) @variable(model, x) cref = @constraint(model, x ≤ α) @objective(model, Max, x) @@ -60,7 +60,7 @@ function test2(args...) 
@test JuMP.dual(cref) == -1.0 @test JuMP.dual(α) == -1.0 - ParameterJuMP.setvalue!(α, 2.0) + fix(α, 2.0) JuMP.optimize!(model) @test JuMP.value(x) == 2.0 @test JuMP.dual(cref) == -1.0 @@ -72,7 +72,7 @@ function test3(args...) @testset "EqualTO modification" begin model = ModelWithParams(args...) α = Parameter(model, 1.0) - ParameterJuMP.setvalue!(α, -1.0) + fix(α, -1.0) @variable(model, x) cref = @constraint(model, x == α) @objective(model, Max, x) @@ -81,7 +81,7 @@ function test3(args...) @test JuMP.dual(cref) == -1.0 @test JuMP.dual(α) == -1.0 - ParameterJuMP.setvalue!(α, 2.0) + fix(α, 2.0) JuMP.optimize!(model) @test JuMP.value(x) == 2.0 @test JuMP.dual(cref) == -1.0 @@ -93,7 +93,7 @@ function test4(args...) @testset "GreaterThan modification" begin model = ModelWithParams(args...) α = Parameter(model, 1.0) - ParameterJuMP.setvalue!(α, -1.0) + fix(α, -1.0) @variable(model, x) cref = @constraint(model, x >= α) @objective(model, Min, x) @@ -102,7 +102,7 @@ function test4(args...) @test JuMP.dual(cref) == 1.0 @test JuMP.dual(α) == 1.0 - ParameterJuMP.setvalue!(α, 2.0) + fix(α, 2.0) JuMP.optimize!(model) @test JuMP.value(x) == 2.0 @test JuMP.dual(cref) == 1.0 @@ -193,7 +193,7 @@ function test10(args...) @testset "Add after solve" begin model = ModelWithParams(args...) α = Parameter(model, 1.0) - ParameterJuMP.setvalue!(α, -1.0) + fix(α, -1.0) @variable(model, x) cref = @constraint(model, x <= α) @objective(model, Max, x) @@ -232,7 +232,7 @@ function test11(args...) @testset "Change coefficient" begin model = ModelWithParams(args...) α = Parameter(model, 1.0) - ParameterJuMP.setvalue!(α, -1.0) + fix(α, -1.0) @variable(model, x) cref = @constraint(model, x >= α) @objective(model, Min, x) @@ -290,7 +290,7 @@ function test12(args...) @testset "Remove parameter from constraint" begin model = ModelWithParams(args...) 
α = Parameter(model, 1.0) - ParameterJuMP.setvalue!(α, -1.0) + fix(α, -1.0) @variable(model, x) cref = @constraint(model, x >= α) @objective(model, Min, x) @@ -308,7 +308,7 @@ function test12(args...) @testset "Remove parameter from all constraint" begin model = ModelWithParams(args...) α = Parameter(model, 1.0) - ParameterJuMP.setvalue!(α, -1.0) + fix(α, -1.0) @variable(model, x) cref = @constraint(model, x >= α) @objective(model, Min, x) @@ -330,17 +330,94 @@ function test13(args...) model = ModelWithParams(args...) ParameterJuMP.set_no_duals(model) α = Parameter(model, 1.0) - ParameterJuMP.setvalue!(α, -1.0) + fix(α, -1.0) @variable(model, x) cref = @constraint(model, x == α) @objective(model, Max, x) JuMP.optimize!(model) @test JuMP.value(x) == -1.0 @test JuMP.dual(cref) == -1.0 - @test_throws ErrorException JuMP.dual(α) + @test isnan(JuMP.dual(α)) model_2 = ModelWithParams(args...) y = Parameter(model_2, 1.0) @test_throws ErrorException ParameterJuMP.set_no_duals(model_2) end +end + +using SparseArrays + +macro test_expression(expr) + esc(quote + @test JuMP.isequal_canonical(@expression(m, $expr), $expr) + end) +end + +macro test_expression_with_string(expr, str) + esc(quote + @test string(@inferred $expr) == $str + @test_expression $expr + end) +end + +function test14(args...) 
+ @testset "Test ParametrizedAffExpr" begin + m = ModelWithParams() + @variable(m, x) + @variable(m, y) + a = Parameter(m) + @test name(a) == "" + set_name(a, "a") + @test name(a) == "a" + b = Parameter(m) + set_name(b, "b") + + exp1 = x + y + a + @test typeof(exp1) == ParameterJuMP.ParametrizedAffExpr{Float64} + @test length(exp1.v.terms) == 2 + exp1 = exp1 + y + @test length(exp1.v.terms) == 2 + + @test iszero(ParameterJuMP.ParametrizedAffExpr{Float64}()) + @test iszero(zero(exp1)) + @test iszero(one(exp1) - one(ParameterJuMP.ParametrizedAffExpr{Float64})) + @test iszero(SparseArrays.dropzeros((exp1 - copy(exp1)))) + + empty_func(empty_arg) = 0 + exp2 = map_coefficients(Base.zero, exp1) + @test iszero(SparseArrays.dropzeros(exp2)) + @test iszero(constant(exp2)) + + @test isequal_canonical(exp1, copy(exp1)) + + exp4 = exp1 - copy(exp1) + @test iszero(SparseArrays.dropzeros(exp4)) + + # var + num + @test_expression_with_string 7.1 * x + 2.5 "7.1 x + 2.5" + @test_expression_with_string 1.2 * y + 1.2 "1.2 y + 1.2" + # par + num + @test_expression_with_string 7.1 * a + 2.5 "7.1 a + 2.5" + @test_expression_with_string 1.2 * b + 1.2 "1.2 b + 1.2" + + # par + par + num + @test_expression_with_string b + a + 1.2 "b + a + 1.2" + # par - par + num + @test_expression_with_string b - a + 1.2 "b - a + 1.2" + # par - (par - num) + @test_expression_with_string b - (a - 1.2) "b - a + 1.2" + # par - par - num + @test_expression_with_string b - a - 1.2 "b - a - 1.2" + # var + par + num + @test_expression_with_string x + a + 1.2 "x + a + 1.2" + # var + var + par + num + @test_expression_with_string x + y + a + 1.2 "x + y + a + 1.2" + # var + var - par + num + @test_expression_with_string x + y - a + 1.2 "x + y - a + 1.2" + # var + var - par + par + num + @test_expression_with_string x + y - a + b + 1.2 "x + y + b - a + 1.2" + # par + par - var + var + num + @test_expression_with_string a + b - x + y + 1.2 "y - x + a + b + 1.2" + + end end \ No newline at end of file