
Add an example of column generation #2006

Closed (wanted to merge 21 commits)

Conversation

@dourouc05 (Contributor):

Follows #2004

codecov bot commented Jul 9, 2019

Codecov Report

Merging #2006 into master will decrease coverage by 0.03%.
The diff coverage is n/a.


@@            Coverage Diff            @@
##           master   #2006      +/-   ##
=========================================
- Coverage   89.13%   89.1%   -0.04%     
=========================================
  Files          33      33              
  Lines        4324    4312      -12     
=========================================
- Hits         3854    3842      -12     
  Misses        470     470
Impacted Files Coverage Δ
src/aff_expr.jl 85.82% <0%> (-0.75%) ⬇️
src/quad_expr.jl 89.03% <0%> (-0.65%) ⬇️
src/objective.jl 95.23% <0%> (+4.32%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 010fccf...4c19ef8.

@odow (Member) left a comment:


Please follow the style of the other examples: place everything inside a function, and add an informative docstring.

@dourouc05 (Contributor, Author):

Here is an updated example, with much more description (the problem to solve and a general overview of the technique), though I did not write a full column-generation tutorial. I did not change the division into functions, however: there is just a main function and the pricing problem, and merging the two would make things less clear, I fear.
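The two-function split described above (a main loop plus a pricing problem) is the classic column-generation layout for cutting stock: the master LP chooses how often to use each known cutting pattern, and the pricing step searches for a new pattern whose dual-weighted value beats the cost of an extra roll. As a rough, dependency-free illustration of the pricing step only (the example in this PR formulates it as an integer program with JuMP and GLPK; the function and argument names below are hypothetical, not the PR's code), the underlying unbounded knapsack can be solved by dynamic programming:

```julia
# Hypothetical sketch of a pricing step for cutting stock, assuming
# nonnegative duals: maximize the dual-weighted count of pieces cut
# from one roll of width `maxwidth` (an unbounded knapsack).
function solve_pricing_dp(duals::Vector{Float64}, widths::Vector{Int}, maxwidth::Int)
    # best[w + 1] = largest dual-weighted value achievable with total width <= w
    best = zeros(maxwidth + 1)
    choice = zeros(Int, maxwidth + 1)   # last piece added at capacity w (0 = none)
    for w in 1:maxwidth, i in eachindex(widths)
        if widths[i] <= w && best[w - widths[i] + 1] + duals[i] > best[w + 1]
            best[w + 1] = best[w - widths[i] + 1] + duals[i]
            choice[w + 1] = i
        end
    end
    # Walk back through the choices to recover the pattern itself.
    pattern = zeros(Int, length(widths))
    w = maxwidth
    while w > 0 && choice[w + 1] != 0
        pattern[choice[w + 1]] += 1
        w -= widths[choice[w + 1]]
    end
    return pattern
end
```

In the surrounding column-generation loop, such a pattern would only be added to the master problem if its reduced cost is negative, i.e. if its dual-weighted value exceeds the cost of a new roll.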

@matbesancon (Contributor):

@dourouc05 cool that you're using it. Instead of a repo, I will archive both the post and the code in a reproducible state (with the Manifest and all), so you can cite something stable.


using JuMP, GLPK
using SparseArrays
const MOI = JuMP.MathOptInterface
A Member commented:

This is already exported by JuMP

@matbesancon (Contributor):

@dourouc05 you can use 10.5281/zenodo.3329388 as DOI or:

@misc{mathieu_besancon_2019_3329389,
  author       = {Mathieu Besançon},
  title        = {{A column generation example in Julia and JuMP}},
  month        = jul,
  year         = 2019,
  doi          = {10.5281/zenodo.3329388},
  url          = {https://doi.org/10.5281/zenodo.3329388}
}

@dourouc05 (Contributor, Author):

I did not add the full BibTeX entry, as it is easy enough to find from Zenodo. Is that good enough for you?

@matbesancon (Contributor):

Yup all good thanks

examples/cutting_stock_column_generation.jl (two review threads, resolved)
new_pattern = try
solve_pricing(dual.(demand_satisfaction), maxwidth, widths, rollcost, demand, prices)
catch
# At the final iteration, GLPK has dual values, but at least one of them is 0.0, and thus GLPK crashes.
A Member commented:

Why does a dual value of 0.0 cause GLPK to crash? That shouldn't happen.

@dourouc05 (Contributor, Author):

Actually, it's more that GLPK.jl gives an error (Cbc.jl too):

ERROR: LoadError: AssertionError: dual >= 0.0
Stacktrace:
 [1] __assert_dual_sense__ at C:\Users\…\.julia\packages\LinQuadOptInterface\ZMx9f\src\solve.jl:210 [inlined]
 [2] get at C:\Users\…\.julia\packages\LinQuadOptInterface\ZMx9f\src\solve.jl:217 [inlined]
 [3] get(::MathOptInterface.Bridges.LazyBridgeOptimizer{GLPK.Optimizer,MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Bridges.AllBridgedConstraints{Float64}}}, ::MathOptInterface.ConstraintDual, ::MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}) at C:\Users\…\.julia\packages\MathOptInterface\C3lip\src\Bridges\bridgeoptimizer.jl:236
 [4] get(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{JuMP._MOIModel{Float64}}}, ::MathOptInterface.ConstraintDual, ::MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}) at C:\Users\…\.julia\packages\MathOptInterface\C3lip\src\Utilities\cachingoptimizer.jl:453
 [5] _moi_get_result(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{JuMP._MOIModel{Float64}}}, ::MathOptInterface.ConstraintDual, ::Vararg{Any,N} where N) at C:\Users\…\.julia\dev\JuMP\src\JuMP.jl:618
 [6] get(::Model, ::MathOptInterface.ConstraintDual, ::ConstraintRef{Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}},ScalarShape}) at C:\Users\…\.julia\dev\JuMP\src\JuMP.jl:649
 [7] _constraint_dual(::ConstraintRef{Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}},ScalarShape}) at C:\Users\…\.julia\dev\JuMP\src\constraints.jl:545
 [8] dual(::ConstraintRef{Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}},ScalarShape}) at C:\Users\…\.julia\dev\JuMP\src\constraints.jl:538
 [9] _broadcast_getindex_evalf at .\broadcast.jl:625 [inlined]
 [10] _broadcast_getindex at .\broadcast.jl:598 [inlined]
 [11] getindex at .\broadcast.jl:558 [inlined]
 [12] copyto_nonleaf!(::Array{Float64,1}, ::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1},Tuple{Base.OneTo{Int64}},typeof(dual),Tuple{Base.Broadcast.Extruded{Array{ConstraintRef{Model,C,Shape} where Shape<:AbstractShape where C,1},Tuple{Bool},Tuple{Int64}}}}, ::Base.OneTo{Int64}, ::Int64, ::Int64) at .\broadcast.jl:982
 [13] copy at .\broadcast.jl:836 [inlined]
 [14] materialize at .\broadcast.jl:798 [inlined]
 [15] #example_cutting_stock#3(::Int64, ::typeof(example_cutting_stock)) at C:\Users\…\Desktop\a.jl:126
 [16] example_cutting_stock() at C:\Users\…\Desktop\a.jl:88
 [17] top-level scope at C:\Users\…\Desktop\a.jl:175

dourouc05 force-pushed the dourouc05/example-colgen branch from 9b23c69 to 790da29 on July 18, 2019, 16:24
dourouc05 force-pushed the dourouc05/example-colgen branch from 790da29 to 4c19ef8 on July 18, 2019, 16:33
@dourouc05 (Contributor, Author):

I'm messing up the rebase; I'll open a new PR.

@dourouc05 dourouc05 closed this Jul 18, 2019
dourouc05 added a commit to dourouc05/JuMP.jl that referenced this pull request Jul 18, 2019
dourouc05 added a commit to dourouc05/JuMP.jl that referenced this pull request Sep 7, 2019
dourouc05 added a commit to dourouc05/JuMP.jl that referenced this pull request Sep 7, 2019
mlubin pushed a commit that referenced this pull request Sep 7, 2019
* Restore the old PR.

#2006

* Use the new function set_objective_coefficient.

* Add the standard license header.

* Remove the workaround for current GLPK.jl.

* Update to JuMP 0.20.
@dourouc05 dourouc05 deleted the dourouc05/example-colgen branch September 7, 2019 01:58
7 participants