Matrix manipulation and optimization

Przemysław Szufel

pwd() # Ctrl + ENTER
"C:\\AAABIBLIOTEKA\\MIT_Boston\\Stuttgart"
versioninfo()
Julia Version 1.9.3
Commit bed2cd540a (2023-08-24 14:43 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: Windows (x86_64-w64-mingw32)
  CPU: 12 × 13th Gen Intel(R) Core(TM) i7-1355U
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-14.0.6 (ORCJIT, goldmont)
  Threads: 2 on 12 virtual cores
Environment:
  JULIA_DEPOT_PATH = c:\JuliaPkg\Julia-1.9.3
  JULIA_HOME = c:\Julia-1.9.3
  JULIA_VERSION = Julia-1.9.3

Working with matrices

Parametric data types in vectors and matrices

v1 = [1,4,5]
3-element Vector{Int64}:
 1
 4
 5
typeof(v1)
Vector{Int64} (alias for Array{Int64, 1})
v2 = [1.1, 2.2]
2-element Vector{Float64}:
 1.1
 2.2
typeof(v2)
Vector{Float64} (alias for Array{Float64, 1})
eltype(v1),  eltype(v2)
(Int64, Float64)
b = Int[]
append!(b, 5)
1-element Vector{Int64}:
 5
c = []
append!(c, 5)
1-element Vector{Any}:
 5
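An untyped [] literal produces a Vector{Any}, which accepts anything but is slower than a concretely typed vector. A minimal sketch of creating an empty typed container instead (the variable name d is just illustrative):

d = Float64[]     # empty Vector{Float64}
push!(d, 2.5)     # push! appends a single element
eltype(d)         # Float64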
c = Complex(1,4)
1 + 4im
typeof(c)
Complex{Int64}
dump(c)
Complex{Int64}
  re: Int64 1
  im: Int64 4
c2=Complex{Float64}(1,4)
1.0 + 4.0im
typeof(c2)
ComplexF64 (alias for Complex{Float64})
c2=Complex{Float32}(1,4)
typeof(c2)
ComplexF32 (alias for Complex{Float32})
x = 1//2 + 1//4
3//4
typeof(x)
Rational{Int64}
c3=Complex{Rational{Int128}}(1//4,4)
@show c3
c3 = 1//4 + 4//1*im
1//4 + 4//1*im
Complex{Float64}(c3)
0.25 + 4.0im
zeros(Int, 2,3)
2×3 Matrix{Int64}:
 0  0  0
 0  0  0
Matrix{Float64}(undef,40, 30 )
40×30 Matrix{Float64}:
 1.18028e288   3.97085e246     1.12252e-311  …  1.24186e-308  1.11097e-308
 8.37171e-144  8.88243e247     8.48798e-314     1.16228e-308  2.80426e-309
 7.7282e-91    2.15799e243     1.12252e-311     1.11368e-308  2.81495e-309
 1.11032e-47   1.14428e243     0.0              1.39053e-308  2.79554e-309
 3.42242e126   1.35617e248   NaN                1.21694e-308  2.77375e-309
 3.75598e199   0.0             1.12252e-311  …  1.21446e-308  2.80379e-309
 1.12247e-311  0.0             1.12252e-311     1.23045e-308  0.0
 1.12252e-311  1.12041e-311    2.122e-314       1.2191e-308   1.4307e-308
 1.12252e-311  1.12252e-311    1.12041e-311     1.22031e-308  1.40223e-308
 4.0e-323      1.35808e-312    1.12252e-311     3.95351e-309  1.431e-308
 6.49269e169   1.12149e-311    1.12252e-311  …  4.09709e-309  1.40176e-308
 1.12247e-311  1.16095e-28     1.12252e-311     1.41472e-308  1.43129e-308
 0.0           5.80815e286   NaN                4.53233e-309  1.25379e-308
 ⋮                                           ⋱                
 1.12252e-311  0.0             0.0              1.25441e-308  0.0
 1.12252e-311  0.0             0.0              1.71226e-309  0.0
 1.12252e-311  1.12252e-311  NaN             …  2.73984e-309  0.0
 1.56246e161   1.12252e-311    1.12252e-311     1.07513e-308  0.0
 1.12041e-311  1.12252e-311    1.12252e-311     1.26064e-308  0.0
 1.12252e-311  1.12252e-311    2.122e-314       1.73581e-309  0.0
 4.24399e-314  4.0e-323        1.12041e-311     2.79838e-309  0.0
 4.0e-323      0.0             1.12247e-311  …  1.37498e-308  0.0
 1.86078e160   1.12252e-311    0.0              1.10946e-308  0.0
 1.12213e-311  1.12252e-311    0.0              1.10984e-308  0.0
 1.12149e-311  1.12252e-311    1.12252e-311     1.38799e-308  0.0
 1.16466e-28   0.0             0.0              1.23517e-308  0.0
@show Array{Any}(undef, 2, 3)   # 2x3 Matrix of Any
@show zeros(5)              # vector of Float64 zeros
@show ones(Int64, 2, 1)     # 2x1 array of Int64 ones
@show trues(3), falses(3)   # tuple of a vector of trues and a vector of falses

@show x = range(1, stop=2, length=5)  # iterator having 5 equally spaced elements
@show collect(x)    # converts iterator to vector
@show 1:10          # iterable from 1 to 10
@show 1:2:10        # iterable from 1 to 9 with step 2
@show reshape(1:12, 3, 4)   # 3x4 array filled with 1:12 values
Array{Any}(undef, 2, 3) = Any[#undef #undef #undef; #undef #undef #undef]
zeros(5) = [0.0, 0.0, 0.0, 0.0, 0.0]
ones(Int64, 2, 1) = [1; 1;;]
(trues(3), falses(3)) = (Bool[1, 1, 1], Bool[0, 0, 0])
x = range(1, stop = 2, length = 5) = 1.0:0.25:2.0
collect(x) = [1.0, 1.25, 1.5, 1.75, 2.0]
1:10 = 1:10
1:2:10 = 1:2:9
reshape(1:12, 3, 4) = [1 4 7 10; 2 5 8 11; 3 6 9 12]
3×4 reshape(::UnitRange{Int64}, 3, 4) with eltype Int64:
 1  4  7  10
 2  5  8  11
 3  6  9  12
Matrix
Matrix (alias for Array{T, 2} where T)
y = Matrix{Int64}(undef, 2, 3)
2×3 Matrix{Int64}:
 51539607568  51539607564  30064771079
 81604378641  30064771084  30064771079
m = zeros( 4, 5)
m[1,3] = 66
m
# 1-based numbering 
4×5 Matrix{Float64}:
 0.0  0.0  66.0  0.0  0.0
 0.0  0.0   0.0  0.0  0.0
 0.0  0.0   0.0  0.0  0.0
 0.0  0.0   0.0  0.0  0.0
m[1:end-1,3] #array slicing
3-element Vector{Float64}:
 66.0
  0.0
  0.0
a = [1,2,3,4]
b = a'
b[1,1] = 99
a
4-element Vector{Int64}:
 99
  2
  3
  4
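Note that b = a' is a lazy adjoint sharing memory with a, which is why writing to b[1,1] changed a. A minimal sketch of taking an independent copy instead (assuming that is the intent):

a = [1,2,3,4]
b = copy(a')    # materializes the adjoint into a new 1×4 Matrix
b[1,1] = 99
a               # unchanged: [1, 2, 3, 4]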
a = reshape(1:12, 3, 4)
display(a[:, 3:end]) # 3x2 matrix
display(a[:, 1]) # 3 element vector
display(a[1, :]) # 4 element vector
3×2 Matrix{Int64}:
 7  10
 8  11
 9  12
3-element Vector{Int64}:
 1
 2
 3
4-element Vector{Int64}:
  1
  4
  7
 10
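Slicing such as a[:, 1] copies the selected data; a view shares memory with the parent array instead. A minimal sketch (collect is used so the reshaped range becomes a writable Matrix):

a = collect(reshape(1:12, 3, 4))
v = @view a[:, 1]   # no copy; v aliases the first column of a
v[1] = 100
a[1, 1]             # 100 — the parent array changed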

Vectorization of operators

Bool.( [1 2] .< [2 1])
1×2 BitMatrix:
 1  0
[1 2 3] .< [4 2 2]
1×3 BitMatrix:
 1  0  0
2 .+ [1 2 3; 4 5 6] 
2×3 Matrix{Int64}:
 3  4  5
 6  7  8
f(a,b) = a < b + 1 ? a : b
f (generic function with 1 method)
f.([1 2 3], [4 2 2])
1×3 Matrix{Int64}:
 1  2  2
# broadcasting with the dot operator (.)
[1,2,3] .< [4,2,1]
3-element BitVector:
 1
 0
 0
"""
def f(x,y):
   return 2*x if x < y else -1
"""
"def f(x,y):\n   return 2*x if x < y else -1\n"
f(x,y) =  x<y ? 2x : -1
f (generic function with 1 method)
f(10,30)
20
a=  [1,2,3]
b=  [0,5,6]
3-element Vector{Int64}:
 0
 5
 6
f.(a,b)
3-element Vector{Int64}:
 -1
  4
  6
a .* b
3-element Vector{Int64}:
  0
 10
 18
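Several dotted operations in one expression are fused into a single loop over the elements; the @. macro inserts the dots automatically. A minimal sketch:

a = [1, 2, 3]
b = [0, 5, 6]
@. a * b + 1    # same as a .* b .+ 1, computed in one fused pass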

Working with sparse matrices and unmaterialized data

using SparseArrays
using LinearAlgebra
a = sparse(I, 10, 10)
10×10 SparseMatrixCSC{Bool, Int64} with 10 stored entries:
 1  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅
 ⋅  1  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  1  ⋅  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  1  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  1  ⋅
 ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  1
b = Array(a)
10×10 Matrix{Bool}:
 1  0  0  0  0  0  0  0  0  0
 0  1  0  0  0  0  0  0  0  0
 0  0  1  0  0  0  0  0  0  0
 0  0  0  1  0  0  0  0  0  0
 0  0  0  0  1  0  0  0  0  0
 0  0  0  0  0  1  0  0  0  0
 0  0  0  0  0  0  1  0  0  0
 0  0  0  0  0  0  0  1  0  0
 0  0  0  0  0  0  0  0  1  0
 0  0  0  0  0  0  0  0  0  1
c = sparse(b)
10×10 SparseMatrixCSC{Bool, Int64} with 10 stored entries:
 1  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅
 ⋅  1  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  1  ⋅  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  1  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  1  ⋅
 ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  1
rr = sprand(1000_000,1000_000,0.000_000_1)
1000000×1000000 SparseMatrixCSC{Float64, Int64} with 99870 stored entries:
⎡⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎤
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎣⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎦
using Statistics
Statistics.mean(rr)
4.992023180743609e-8
rr2 = sprand(1000_000,1000_000,0.000_000_01)
1000000×1000000 SparseMatrixCSC{Float64, Int64} with 10050 stored entries:
⎡⣒⣺⡴⠳⣸⠲⢿⢼⡾⢙⣫⣒⣶⠷⣻⣶⣼⢿⡵⢟⣽⡮⣳⡾⣾⣿⡷⢫⣷⣥⣿⣳⢾⡖⣿⣎⣷⡿⡷⡦⎤
⎢⡼⣿⡿⢫⢶⡽⢊⢿⣬⣛⢷⢿⣾⣟⡫⣾⣟⢯⢞⣿⣿⠷⣿⣻⣾⠯⣾⣯⡵⣦⣿⠿⠗⣶⡺⣾⣿⣟⣷⡟⎥
⎢⠛⣯⢺⡦⡾⣷⣿⣵⣖⣝⣜⣿⣿⣾⣿⡯⣏⣿⡋⣾⡿⢳⣵⣿⠽⣽⣿⣭⡿⡏⣗⣽⣬⣿⣍⣿⣧⡩⣧⣞⎥
⎢⣾⣹⣿⡡⣽⠿⣾⣧⣼⠻⣻⢷⣞⡏⣿⡿⣿⢮⢿⢺⡷⡯⡿⣿⢻⢻⣟⣿⣻⣹⡛⣴⣿⡞⣿⣽⣩⣮⣏⣇⎥
⎢⠋⢿⢳⡱⣩⣿⣷⢷⢿⢄⣩⣽⡻⣿⡿⡫⡿⡟⣯⡷⣿⠷⡿⡿⣯⣳⠼⣯⣫⢿⡟⣿⣯⣟⢻⢿⣯⢛⣷⢨⎥
⎢⢺⣫⣽⣹⣾⣽⣿⣿⢯⣻⢻⢟⣻⣝⣗⢟⠻⣻⣯⣫⡶⣵⣝⠶⣿⡿⣟⣧⣗⣦⢵⢾⣾⣻⣗⣿⡻⡿⣿⡯⎥
⎢⣼⣧⣽⡏⣷⣿⢏⣿⣿⣿⢾⣗⣍⣏⣷⣚⡖⣿⡶⣿⣕⠝⣿⣾⣷⣽⣩⣞⢿⣿⢽⣿⣿⢿⣽⡇⣭⣿⣿⣿⎥
⎢⡹⣏⡊⣾⡻⢤⣛⢿⣇⣿⣽⡎⢾⣯⠶⣿⢿⣾⣿⣿⡺⣿⣻⣓⡟⣞⣟⣯⣿⣽⢵⣱⡿⢻⣓⣼⣣⣯⡾⠏⎥
⎢⠼⣝⡽⡧⢳⣿⣯⣹⣗⣬⣣⣾⣟⣜⣫⣿⣡⣿⣝⢪⢟⣾⣾⣶⣗⣿⣿⣯⡿⣲⢭⢜⣓⣿⣿⡿⣾⣽⢿⡧⎥
⎢⡒⡽⣿⣝⡙⣵⠗⣷⣿⣿⣻⢞⢛⡷⠇⣷⢷⢱⣯⣿⢽⣯⡝⣺⣻⣚⣚⢧⡿⡽⣮⡿⣷⡿⣷⣯⣹⣿⡦⡝⎥
⎢⣿⣗⡛⡾⢽⢮⣿⣿⣲⣿⡻⣮⣿⢾⣶⣿⠺⣸⣷⣻⣿⣷⡿⣻⣻⣿⣮⡥⡾⣿⣿⣼⣾⣶⡿⡿⡿⣾⣷⣷⎥
⎢⡵⣛⣿⣽⣾⣿⣻⣦⣿⡾⣯⣾⣖⡻⡼⣷⡾⣺⣛⢽⡿⣻⣜⠿⢗⣽⣛⢿⣸⣿⣽⣮⣿⣿⡹⣽⣮⡿⣻⡔⎥
⎢⠿⢿⡿⣟⠾⣻⣽⢯⣿⣿⢟⠿⣿⣻⣿⣷⣪⣝⡟⣿⢽⣮⢃⠮⣿⣌⣿⢏⣻⣾⣾⢾⢟⣿⡿⣻⣽⡟⣿⢟⎥
⎢⣻⣿⢟⣿⣺⡦⢽⢷⣯⢹⣽⣽⣿⡾⡿⣿⠻⣿⢷⣽⢿⣿⣷⢛⡳⢿⢿⣷⣟⢻⢾⣑⣛⣵⢽⡿⣳⡃⢻⣟⎥
⎢⣳⡻⢧⡟⣿⡾⡺⠿⣟⣿⠾⣲⣷⢽⣯⡿⣿⣭⡿⣿⡿⢲⣻⣿⣯⢺⡷⣧⣹⣿⣟⣿⠿⠿⢿⣹⣦⣹⣗⣾⎥
⎢⡹⣙⡿⠿⢩⡷⣍⣪⣯⡻⣿⣽⣿⣿⣷⢷⢹⣟⣙⣽⣟⡝⣿⡳⣯⢹⣸⢿⠭⣟⡅⣟⣿⣈⣿⣟⢫⡽⣍⣯⎥
⎢⢯⣿⣍⣻⡻⣶⣷⡯⣛⣞⣞⣿⡯⣷⡽⣯⢿⣭⡹⣿⣝⡦⣿⣿⣿⡬⣽⣿⣟⣭⣝⣾⡿⣿⣿⡿⠬⣭⢢⠯⎥
⎢⣺⣪⣺⣾⣮⡷⡮⡏⣿⣿⣻⣿⣷⠧⡵⣷⣹⡮⣷⣝⣾⢿⢷⢟⠏⣷⣭⣿⡞⣾⣛⣝⣩⣾⢦⣽⣿⠿⣼⡯⎥
⎢⣋⣟⣚⣾⣛⣿⣿⣯⣿⣺⣯⢿⣿⢿⣽⠽⣿⣟⣿⣺⣿⡽⢷⢿⣿⣽⣿⢟⣿⣽⠿⠛⣸⢿⡯⢽⣟⢛⣿⡻⎥
⎣⠸⠯⡿⠽⡖⠏⠿⢿⢗⣿⣗⡳⡷⠷⡿⡿⠯⡟⢯⡾⢯⢫⡵⢟⣞⠷⡯⡿⣛⠿⠼⠿⢽⡿⡣⠾⠯⡾⣽⣍⎦
rr_sum = rr .+ rr2
1000000×1000000 SparseMatrixCSC{Float64, Int64} with 109920 stored entries:
⎡⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎤
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎢⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎥
⎣⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⎦
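A sparse matrix can also be built directly from (row, column, value) triplets and inspected without materializing it; the small example data below is made up for illustration:

Is = [1, 3, 5]; Js = [2, 3, 1]; Vs = [10.0, 20.0, 30.0]
s = sparse(Is, Js, Vs, 5, 5)   # 5×5 matrix with 3 stored entries
nnz(s)                         # number of stored entries
findnz(s)                      # recovers the (I, J, V) triplets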
using BandedMatrices
BandedMatrix(Ones(7,7), (0,0))
7×7 BandedMatrix{Float64} with bandwidths (0, 0):
 1.0   ⋅    ⋅    ⋅    ⋅    ⋅    ⋅ 
  ⋅   1.0   ⋅    ⋅    ⋅    ⋅    ⋅ 
  ⋅    ⋅   1.0   ⋅    ⋅    ⋅    ⋅ 
  ⋅    ⋅    ⋅   1.0   ⋅    ⋅    ⋅ 
  ⋅    ⋅    ⋅    ⋅   1.0   ⋅    ⋅ 
  ⋅    ⋅    ⋅    ⋅    ⋅   1.0   ⋅ 
  ⋅    ⋅    ⋅    ⋅    ⋅    ⋅   1.0
a = BandedMatrix(Eye(7), (1,1))
7×7 BandedMatrix{Float64} with bandwidths (1, 1):
 1.0  0.0   ⋅    ⋅    ⋅    ⋅    ⋅ 
 0.0  1.0  0.0   ⋅    ⋅    ⋅    ⋅ 
  ⋅   0.0  1.0  0.0   ⋅    ⋅    ⋅ 
  ⋅    ⋅   0.0  1.0  0.0   ⋅    ⋅ 
  ⋅    ⋅    ⋅   0.0  1.0  0.0   ⋅ 
  ⋅    ⋅    ⋅    ⋅   0.0  1.0  0.0
  ⋅    ⋅    ⋅    ⋅    ⋅   0.0  1.0
BandedMatrix(Zeros(7,7), (1,2))  #BandedMatrix(Zeros(m,n), (l,u))    
7×7 BandedMatrix{Float64} with bandwidths (1, 2):
 0.0  0.0  0.0   ⋅    ⋅    ⋅    ⋅ 
 0.0  0.0  0.0  0.0   ⋅    ⋅    ⋅ 
  ⋅   0.0  0.0  0.0  0.0   ⋅    ⋅ 
  ⋅    ⋅   0.0  0.0  0.0  0.0   ⋅ 
  ⋅    ⋅    ⋅   0.0  0.0  0.0  0.0
  ⋅    ⋅    ⋅    ⋅   0.0  0.0  0.0
  ⋅    ⋅    ⋅    ⋅    ⋅   0.0  0.0
b = BandedMatrix((0=>1:7,-1=>8:14), (7,7), (1,0))  
# BandedMatrix(band_pairs, (m,n), (l,u))
7×7 BandedMatrix{Int64} with bandwidths (1, 0):
 1  ⋅   ⋅   ⋅   ⋅   ⋅  ⋅
 8  2   ⋅   ⋅   ⋅   ⋅  ⋅
 ⋅  9   3   ⋅   ⋅   ⋅  ⋅
 ⋅  ⋅  10   4   ⋅   ⋅  ⋅
 ⋅  ⋅   ⋅  11   5   ⋅  ⋅
 ⋅  ⋅   ⋅   ⋅  12   6  ⋅
 ⋅  ⋅   ⋅   ⋅   ⋅  13  7
c = a+b
7×7 BandedMatrix{Float64} with bandwidths (1, 1):
 2.0  0.0    ⋅     ⋅     ⋅     ⋅    ⋅ 
 8.0  3.0   0.0    ⋅     ⋅     ⋅    ⋅ 
  ⋅   9.0   4.0   0.0    ⋅     ⋅    ⋅ 
  ⋅    ⋅   10.0   5.0   0.0    ⋅    ⋅ 
  ⋅    ⋅     ⋅   11.0   6.0   0.0   ⋅ 
  ⋅    ⋅     ⋅     ⋅   12.0   7.0  0.0
  ⋅    ⋅     ⋅     ⋅     ⋅   13.0  8.0
c[5,5] = 99
c[6,5] = 999
c
7×7 BandedMatrix{Float64} with bandwidths (1, 1):
 2.0  0.0    ⋅     ⋅      ⋅     ⋅    ⋅ 
 8.0  3.0   0.0    ⋅      ⋅     ⋅    ⋅ 
  ⋅   9.0   4.0   0.0     ⋅     ⋅    ⋅ 
  ⋅    ⋅   10.0   5.0    0.0    ⋅    ⋅ 
  ⋅    ⋅     ⋅   11.0   99.0   0.0   ⋅ 
  ⋅    ⋅     ⋅     ⋅   999.0   7.0  0.0
  ⋅    ⋅     ⋅     ⋅      ⋅   13.0  8.0
A = brand(10000,10000,4,3)
10000×10000 BandedMatrix{Float64} with bandwidths (4, 3):
 0.717031  0.00970876  0.36922   0.676207   …   ⋅         ⋅         ⋅ 
 0.524178  0.458799    0.769404  0.0453147      ⋅         ⋅         ⋅ 
 0.64484   0.24969     0.876344  0.0749038      ⋅         ⋅         ⋅ 
 0.898134  0.544399    0.505147  0.468312       ⋅         ⋅         ⋅ 
 0.447232  0.937858    0.918129  0.584957       ⋅         ⋅         ⋅ 
  ⋅        0.472211    0.725038  0.169458   …   ⋅         ⋅         ⋅ 
  ⋅         ⋅          0.875969  0.86004        ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅        0.413408       ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅             ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅             ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅         …   ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅             ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅             ⋅         ⋅         ⋅ 
 ⋮                                          ⋱                      
  ⋅         ⋅           ⋅         ⋅             ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅             ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅         …   ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅             ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅             ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅             ⋅         ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅            0.601123   ⋅         ⋅ 
  ⋅         ⋅           ⋅         ⋅         …  0.845169  0.991419   ⋅ 
  ⋅         ⋅           ⋅         ⋅            0.811976  0.670674  0.474065
  ⋅         ⋅           ⋅         ⋅            0.318209  0.641995  0.94989
  ⋅         ⋅           ⋅         ⋅            0.22135   0.198353  0.664046
  ⋅         ⋅           ⋅         ⋅            0.544739  0.733808  0.51824
b = randn(10000)
10000-element Vector{Float64}:
 -0.8946805178374083
  1.791456844344719
 -1.006332232770576
  0.791271927711156
 -0.94388977856925
 -0.5040404487535977
  1.1438942503322178
  0.9156317592307124
 -0.35552174058202674
 -0.6318695191330024
  1.5025208196461846
 -1.4482466971059618
  0.17640586952859919
  ⋮
 -0.3754881597565462
 -0.51204988872086
 -0.23009357169989553
  0.9263014134731414
  1.1145981384963324
 -0.18044177324813834
  0.5248468744312341
  0.17633316812053937
  1.8671731928639776
 -0.9640139843966083
 -0.9608403570552534
 -2.0782487798309455
A*b  #   Calls optimized matrix*vector routine
10000-element Vector{Float64}:
 -0.46061597769868723
 -0.9265070327469163
 -2.013311452629498
 -0.7506352359068451
 -0.06983968834709088
  1.018277053750206
 -0.8251580030395005
  1.8107551270275635
 -1.5123515240377137
 -0.5287396651053948
 -0.8581073378399161
 -0.7516150550394638
 -1.2631999408486454
  ⋮
  0.1842696379906157
  1.5976905287134318
  1.0873361557626304
  1.33002721957883
  1.9513355177555978
  2.3502263751233157
  0.7054007981349245
 -0.1342826431395634
 -1.290661336050847
 -1.3111973496087521
  0.06480040232269113
 -1.388614621665221
A*A  #   Calls optimized matrix*matrix routine
10000×10000 BandedMatrix{Float64} with bandwidths (8, 6):
 1.36464  0.471733  0.937361  0.829633  …   ⋅         ⋅         ⋅ 
 1.40954  0.969943  1.76996   0.789391      ⋅         ⋅         ⋅ 
 1.58287  1.41727   2.41116   1.11858       ⋅         ⋅         ⋅ 
 1.98477  1.56936   2.6256    1.52106       ⋅         ⋅         ⋅ 
 2.24524  1.907     3.07939   1.23695       ⋅         ⋅         ⋅ 
 1.00182  1.04811   2.15919   1.19748   …   ⋅         ⋅         ⋅ 
 1.52705  1.3049    2.08748   1.12917       ⋅         ⋅         ⋅ 
 0.69866  1.18208   1.87358   1.61458       ⋅         ⋅         ⋅ 
 0.44223  1.29412   2.13079   1.59959       ⋅         ⋅         ⋅ 
  ⋅       0.175528  0.401125  0.228377      ⋅         ⋅         ⋅ 
  ⋅        ⋅        0.312613  0.478744  …   ⋅         ⋅         ⋅ 
  ⋅        ⋅         ⋅        0.267922      ⋅         ⋅         ⋅ 
  ⋅        ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
 ⋮                                      ⋱                      
  ⋅        ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅        ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅        ⋅         ⋅         ⋅        …   ⋅         ⋅         ⋅ 
  ⋅        ⋅         ⋅         ⋅           0.579612   ⋅         ⋅ 
  ⋅        ⋅         ⋅         ⋅           1.24427   0.775432   ⋅ 
  ⋅        ⋅         ⋅         ⋅           1.09257   0.557801  0.27567
  ⋅        ⋅         ⋅         ⋅           1.26119   1.1039    0.679625
  ⋅        ⋅         ⋅         ⋅        …  1.78903   1.65778   1.57376
  ⋅        ⋅         ⋅         ⋅           1.56548   1.86529   1.48994
  ⋅        ⋅         ⋅         ⋅           1.88381   1.90089   1.6069
  ⋅        ⋅         ⋅         ⋅           1.78658   1.89717   1.11774
  ⋅        ⋅         ⋅         ⋅           1.55716   1.85718   1.47509
@time A\b  #   Calls optimized matrix\vector routine
  2.290472 seconds (2.51 M allocations: 170.883 MiB, 6.95% gc time, 99.57% compilation time)
10000-element Vector{Float64}:
  -9.91674386347239
 -10.444435524208323
 -41.59197593049818
  32.05223575044192
  73.84927967273306
 -27.812181705093835
 -88.75981265563105
  56.52742402129784
  85.06971609320844
 -79.71545999885285
 -77.01728811351599
   9.002008266490867
  59.49522736987263
   ⋮
   4.779827015270882e10
   6.540332502802427e10
  -3.116607411863486e10
   3.1950751793228928e10
  -1.7542419719066235e10
   2.534581982521389e10
  -5.195472316257162e10
  -1.2631042018255177e10
   1.4642478006565704e10
   5.8643172114918945e10
  -3.0952824972707294e10
  -1.2727204977828215e10
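Almost all of the @time above is compilation (99.57%). Timing the call a second time, or using BenchmarkTools, gives a more representative measurement; a minimal sketch, assuming the BenchmarkTools package is installed:

@time A \ b          # second call: already compiled, so much faster
using BenchmarkTools
@btime $A \ $b;      # runs the call repeatedly and reports the minimum time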

Optimization models

using JuMP, HiGHS
m = Model(HiGHS.Optimizer)
@variable(m,  x₁ >= 0)
@variable(m,  x₂ >= 0)
@objective(m, Min, 50x₁ + 70x₂)
@constraint(m, 200x₁ + 2000*x₂ >= 9000)
@constraint(m, 100x₁ +   30x₂ >=  300)
@constraint(m, 9x₁   +   11x₂ >=   60)
println(m)
Min 50 x₁ + 70 x₂
Subject to
 200 x₁ + 2000 x₂ >= 9000
 100 x₁ + 30 x₂ >= 300
 9 x₁ + 11 x₂ >= 60
 x₁ >= 0
 x₂ >= 0
m
A JuMP Model
Minimization problem with:
Variables: 2
Objective function type: AffExpr
`AffExpr`-in-`MathOptInterface.GreaterThan{Float64}`: 3 constraints
`VariableRef`-in-`MathOptInterface.GreaterThan{Float64}`: 2 constraints
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: HiGHS
Names registered in the model: x₁, x₂
optimize!(m)
Running HiGHS 1.5.3 [date: 1970-01-01, git hash: 45a127b78]
Copyright (c) 2023 HiGHS under MIT licence terms
Presolving model
3 rows, 2 cols, 6 nonzeros
3 rows, 2 cols, 6 nonzeros
Presolve : Reductions: rows 3(-0); columns 2(-0); elements 6(-0) - Not reduced
Problem not reduced by presolve: solving the LP
Using EKK dual simplex solver - serial
  Iteration        Objective     Infeasibilities num(sum)
          0     0.0000000000e+00 Pr: 3(2205) 0s
          2     3.8814432990e+02 Pr: 0(0) 0s
Model   status      : Optimal
Simplex   iterations: 2
Objective value     :  3.8814432990e+02
HiGHS run time      :          0.02
value(x₁)
1.7010309278350515
typeof(x₁ + 4x₂)
AffExpr (alias for GenericAffExpr{Float64, GenericVariableRef{Float64}})
4x₁ + 4x₁ + x₂

$ 8 x₁ + x₂ $

value.([x₁,x₂])
2-element Vector{Float64}:
 1.7010309278350515
 4.329896907216495
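After optimize! it is good practice to query the solver status before reading the solution; a minimal sketch (the values match the HiGHS log above):

termination_status(m)   # MathOptInterface.OPTIMAL
objective_value(m)      # ≈ 388.14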
# Other solvers can be used, such as the commercial Gurobi
# using Gurobi
# m = Model(Gurobi.Optimizer)

The same model as above, but now the decision variables are required to be integers

m = Model(HiGHS.Optimizer)
@variable(m, 100 >= x₁ >= 0, Int)
@variable(m, 110 >= x₂ >= 0, Int)
@objective(m, Min, 50x₁ + 70x₂)
@constraint(m, 200x₁ + 2000x₂ >= 9000)
@constraint(m, 100x₁ +   30x₂ >=  300)
@constraint(m, 9x₁   +   11x₂ >=   60)
println(m)
Min 50 x₁ + 70 x₂
Subject to
 200 x₁ + 2000 x₂ >= 9000
 100 x₁ + 30 x₂ >= 300
 9 x₁ + 11 x₂ >= 60
 x₁ >= 0
 x₂ >= 0
 x₁ <= 100
 x₂ <= 110
 x₁ integer
 x₂ integer
optimize!(m)
Running HiGHS 1.5.3 [date: 1970-01-01, git hash: 45a127b78]
Copyright (c) 2023 HiGHS under MIT licence terms
Presolving model
3 rows, 2 cols, 6 nonzeros
3 rows, 2 cols, 6 nonzeros
Objective function is integral with scale 0.1

Solving MIP model with:
   3 rows
   2 cols (0 binary, 2 integer, 0 implied int., 0 continuous)
   6 nonzeros

        Nodes      |    B&B Tree     |            Objective Bounds              |  Dynamic Constraints |       Work      
     Proc. InQueue |  Leaves   Expl. | BestBound       BestSol              Gap |   Cuts   InLp Confl. | LpIters     Time

         0       0         0   0.00%   0               inf                  inf        0      0      0         0     0.0s

Solving report
  Status            Optimal
  Primal bound      450
  Dual bound        450
  Gap               0% (tolerance: 0.01%)
  Solution status   feasible
                    450 (objective)
                    0 (bound viol.)
                    0 (int. viol.)
                    0 (row viol.)
  Timing            0.00 (total)
                    0.00 (presolve)
                    0.00 (postsolve)
  Nodes             1
  LP iterations     2 (total)
                    0 (strong br.)
                    0 (separation)
                    0 (heuristics)
value.([x₁,x₂])
2-element Vector{Float64}:
 2.0
 5.0
# the solver may return values such as 1.9999999999999, hence the rounding below
round.(Int, value.([x₁,x₂]))
2-element Vector{Int64}:
 2
 5

What are those @ macros we are seeing around?

code = Meta.parse("x=5")
:(x = 5)
dump(code)
Expr
  head: Symbol =
  args: Array{Any}((2,))
    1: Symbol x
    2: Int64 5
:somysymbolname
:somysymbolname
eval(code)
5
x
5
macro sayhello(name)
    println("Code is being generated")
    return :( println("Hello, ", $name) )
end
@sayhello (macro with 1 method)
@sayhello("John")
Code is being generated
Hello, John
@time @sayhello("John")
Code is being generated
Hello, John
  0.000393 seconds (26 allocations: 1024 bytes)
@macroexpand @sayhello("John")
Code is being generated
:(Main.println("Hello, ", "John"))
println(@macroexpand @sayhello("John"))
Code is being generated
Main.println("Hello, ", "John")
@sayhello("John")
Code is being generated
Hello, John
println(@macroexpand @variable(m, x₁ >= 0));
begin
    #= In[89]:1 =#
    JuMP._valid_model(m, :m)
    begin
        #= c:\JuliaPkg\Julia-1.9.3\packages\JuMP\OUdu2\src\macros.jl:128 =#
        JuMP._error_if_cannot_register(m, :x₁)
        #= c:\JuliaPkg\Julia-1.9.3\packages\JuMP\OUdu2\src\macros.jl:136 =#
        var"#117###340" = begin
                #= c:\JuliaPkg\Julia-1.9.3\packages\JuMP\OUdu2\src\macros.jl:1210 =#
                let m = m
                    #= c:\JuliaPkg\Julia-1.9.3\packages\JuMP\OUdu2\src\macros.jl:1211 =#
                    JuMP.add_variable(m, JuMP.model_convert(m, JuMP.build_variable(JuMP.var"#_error#115"{LineNumberNode}(:(#= In[89]:1 =#), Core.Box((:m, :(x₁ >= 0)))), JuMP.VariableInfo(true, 0, false, NaN, false, NaN, false, NaN, false, false))), if JuMP.set_string_names_on_creation(m)
                            "x₁"
                        else
                            ""
                        end)
                end
            end
        #= c:\JuliaPkg\Julia-1.9.3\packages\JuMP\OUdu2\src\macros.jl:137 =#
        m[:x₁] = var"#117###340"
        #= c:\JuliaPkg\Julia-1.9.3\packages\JuMP\OUdu2\src\macros.jl:143 =#
        x₁ = var"#117###340"
    end
end
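A macro receives the expression itself, not its value, and returns new code that is compiled in its place. A minimal sketch of a simplified @show-like macro (the name @myshow is hypothetical):

macro myshow(ex)
    # string(ex) captures the source text; esc(ex) evaluates it in the caller's scope
    return :( println($(string(ex)), " = ", $(esc(ex))) )
end
@myshow 1 + 2 * 3   # prints: 1 + 2 * 3 = 7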
using Calculus
Meta.parse("(sin(x) + x*x+5x)")
:(sin(x) + x * x + 5x)
using AbstractTrees
AbstractTrees.print_tree(:(sin(x) + x*x+5x))
:(sin(x) + x * x + 5x)
├─ :+
├─ :(sin(x))
│  ├─ :sin
│  └─ :x
├─ :(x * x)
│  ├─ :*
│  ├─ :x
│  └─ :x
└─ :(5x)
   ├─ :*
   ├─ 5
   └─ :x
differentiate(:(sin(x) + x*x+5x))
:(1 * cos(x) + (1x + x * 1) + (0 * x + 5 * 1))
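The returned derivative is itself an Expr, so it can be spliced into an anonymous function and evaluated; a minimal sketch (run as separate top-level statements, assuming Calculus is loaded as above):

dexpr = differentiate(:(sin(x) + x*x + 5x), :x)
df = eval(:( x -> $dexpr ))   # build a callable derivative
df(0.0)                       # cos(0) + 0 + 5 = 6.0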

Subway optimization

using JuMP, HiGHS
using DelimitedFiles
S = 18   # number of warehouses
D = 100  # number of restaurants
supply = fill(100,S)
demand = fill(15,D);
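A quick feasibility check before building the model: total supply must cover total demand.

sum(supply), sum(demand)   # (1800, 1500) — enough capacity to meet all demand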
dat="""21328    7901    16774   24413   14131   21551   15742   21091   25167   3266    19312   14878   22914   18392   14514   21072   11535   12965   12952   12952   15839   27836   16816   13527   13769   4924    23891   26532   10245   15446   16834   11421   27231   20285   25810   12477   16499   26958   23770   32327   23572   26475   23894   2054    22156   28491   2392    21051   18793   14598   9413    8004    14286   8717    24919   27581   21829   26135   33450   4905    18558   23046   9212    20923   6426    20020   24644   10862   12351   16446   7751    19819   17406   16768   9319    17584   8191    18776   22432   20389   25377   3129    22425   6899    16830   12305   3393    24579   16727   21086   30660   26999   29664   26436   26138   28183   20874   30143   30419   29718
5195    15055   4216    9295    6050    4779    3641    15102   8395    16829   5224    3595    14177   1684    2753    4300    5469    4233    3962    3962    3039    13176   12595   4764    5428    12767   7119    11115   6960    1597    1173    5848    26351   10941   19927   7174    13542   11015   6998    18198   18270   9703    7376    15867   21981   12387   19029   6936    9170    15206   7748    8933    11906   12823   24031   21698   8083    11432   18205   18548   17558   25919   10027   7178    10521   19897   29294   15136   13659   10764   18656   7135    9574    19550   13973   11620   17524   10944   9910    9940    26373   13762   11058   16154   22702   19728   14022   13212   4280    7563    22298   13152   15728   12308   17068   19554   12880   16313   18674   20915
24025   13295   22003   22254   20379   26418   22859   10495   28761   12735   29656   23437   36220   24616   24042   28477   24395   24758   25121   25121   26548   22646   30633   22842   26491   16326   27452   24373   23067   25325   26551   21542   15677   33471   14651   19634   30316   26735   29375   24969   12413   28023   29753   17066   10602   28267   17790   19925   31979   11657   20017   21593   15001   22657   12101   16422   32830   33810   28260   19918   7961    10227   17414   32204   20070   8789    9898    25254   26654   15742   22269   18946   30636   6105    14423   31182   22709   32007   34792   15723   12559   17643   34786   12754   2900    26823   15171   36939   28008   19237   19501   18955   20687   20740   15099   17024   13131   22051   19380   18559
22508   22014   17954   25593   16490   22230   16922   24989   25524   22994   19299   15807   6155    19098   15443   21468   12730   13765   13698   13698   16518   29016   5152    14712   12407   19272   24570   27712   12077   16125   17442   13516   36239   9317    29815   14836   4483    28138   23911   33506   28157   27086   24005   18712   31869   29670   18413   22231   11140   22962   14736   13031   19351   12074   33919   31585   16632   23191   34630   15865   26772   35807   17273   14708   14646   29784   39182   9608    8373    18948   12547   20876   8411    28622   21042   6315    12007   7728    12270   22466   36261   17719   9635    23113   31385   8204    20527   7661    15899   22181   34494   28057   30721   27616   29264   31749   23716   31201   32314   33111
12913   5006    10217   14089   7574    15206   10375   9595    18453   9762    18403   10736   23395   12829   11221   17256   11328   11937   12300   12300   13779   17052   17405   10020   13666   7932    17144   16209   9839    13158   14563   8314    17138   20646   14521   6406    17088   16684   19067   21542   12283   17714   19445   11789   12768   18217   15071   10777   19154   2717    6596    8172    2025    9318    14805   16292   20943   23502   22666   14640   7256    16706   3917    20037   6821    10026   20081   11915   13233   4235    14855   9053    17812   8378    3697    18089   14029   19182   22625   8169    17147   9683    22619   6321    11174   17884   9520    24772   15841   10116   19371   16003   18668   15652   14502   16894   8883    19148   18556   18349
6281    16514   5753    10410   7509    5795    5099    16782   9308    18288   3312    4999    12625   2798    4158    5230    4648    3949    3341    3341    2010    14262   11741   5686    4491    14172   8135    12202   6139    1903    833 7252    28032   9390    21608   8633    12688   12061   7800    19313   19950   10719   7894    15046   23662   13403   18173   8050    7618    16665   9152    8112    13364   11967   25712   23378   6171    10135   19220   17692   19239   27600   11486   5266    9700    21577   30975   14265   12788   12222   17785   8823    8548    21230   15432   10622   16653   9685    7998    11620   28054   12940   9507    17612   24382   18873   15427   11660   3003    8736    23737   14267   16842   13422   18513   21173   14561   17427   19876   22534
4656    19128   9008    4124    11894   5115    10216   15084   4473    23560   7924    11296   21642   8036    12133   5794    14532   13370   13273   13273   12492   5169    21810   12436   14643   21431   3145    2102    16062   11050   9979    14704   26333   18420   19909   13545   22757   1361    5712    10225   18252   3044    6090    24906   21963   2893    28161   5968    16648   16074   16577   18035   13324   21984   24014   21680   11094   10146   8657    27685   17540   25902   17061   12978   19586   19879   29276   24306   22830   12163   27827   7636    18788   19532   17819   20835   26694   20159   15710   9922    26355   22801   18447   20443   22684   28942   22852   20547   13660   6657    21239   9808    12248   7988    16401   19061   12862   10557   16997   20422
16446   6905    14412   14675   11772   18839   14588   7393    21182   11243   22077   14949   27612   17024   15434   20898   15787   16151   16513   16513   17993   16064   22025   14234   17883   9936    19873   16795   14459   17371   18776   12934   14600   24863   12201   11026   21708   19156   21796   20555   9963    20444   22174   13463   10230   20688   16645   12347   23371   2861    11340   12916   6325    14030   12267   13972   25156   26231   21678   16314   4936    12206   8733    24250   11443   6518    15339   16627   17978   7135    16968   11368   22029   3256    5796    22574   17408   23399   26839   8145    14538   11797   26832   8220    6019    21522   11470   28986   20054   11658   17050   15108   17399   14664   12468   14574   6924    18252   16749   16109
12070   11940   10048   10299   8061    14463   10878   2839    16806   16286   17701   11239   23901   12661   11724   16522   12122   12440   12803   12803   14282   11080   18749   10523   14172   14959   15497   12419   11183   13370   14597   9658    14089   21152   7665    7700    18432   14780   17420   15539   6007    16068   17798   18505   9719    16312   21688   7971    19661   5094    10528   12139   5548    16015   11769   9435    20875   21855   16694   21357   5295    13657   10644   20249   13538   7634    17032   18604   17127   3353    21847   6992    18318   7287    10724   18909   20725   19688   22837   3769    14111   16676   22831   13254   10342   24395   16513   24984   16053   7282    12491   9848    12040   9648    7426    9830    1565    12993   11480   11273
10329   23462   13628   6311    16247   10788   14836   14910   7331    27913   13481   15916   27214   13710   16784   11351   19112   17990   17893   17893   18028   2647    26430   16997   19262   25789   8815    4323    20625   16722   15636   19217   26160   23992   19703   17898   27377   4669    9312    5419    18078   5880    9548    29264   21790   5165    32546   9541    22221   17837   20935   22545   17686   26421   23840   21156   16651   13619   3334    32102   17366   25728   21419   18550   23944   19705   29103   28886   27409   15635   32264   11041   23408   19358   22153   25455   31131   24779   21282   11857   26182   27159   24019   24777   22510   33562   27210   26120   19317   9710    19225   7343    9606    5523    14848   17430   12203   7915    14982   18439
9046    7229    6350    10918   3587    11339   6397    7802    14626   11985   14424   6758    19420   8962    7243    13389   7493    7959    8322    8322    9801    14306   13857   6042    9691    10151   13317   13037   6291    9180    10585   4767    19051   16671   12628   2808    13541   13463   15240   18797   10970   13888   15618   13923   14681   14996   17205   7556    15180   6200    5636    7247    2636    11123   16732   14398   16964   19675   19921   16774   9996    18620   5807    16059   8646    12597   21994   13712   12236   1761    16966   5969    13837   11847   5920    14279   15834   15208   18647   5814    19073   11817   18641   8544    14609   19504   11740   20794   11863   7254    17377   13258   15923   12907   12147   14632   6528    16403   16201   15994
4657    14007   4438    3771    6792    7050    5646    9018    9393    18458   10288   6726    19802   5879    7594    9109    9647    8800    8703    8703    8838    7010    17078   7532    10073   16334   8084    5890    11170   7533    8037    9762    20267   16877   13843   8443    18024   7499    10007   11501   12186   8655    10385   19810   15897   9032    23092   475 15105   10063   11480   13090   8130    16966   17948   15614   13462   14442   12625   22647   11474   19836   11964   13509   14489   13813   23210   19421   17945   6236    22809   1641    14219   13466   12699   15994   21677   15589   16241   3856    20289   17704   16993   15322   16618   24210   17755   19147   10215   239 15528   6455    8962    5611    10304   12964   6796    9616    11742   14325
16168   26420   17376   13468   19205   18438   18584   14330   20479   30871   21676   19664   32739   18651   20532   20497   22329   21738   21641   21641   21776   11708   29491   20520   23011   28747   19444   14693   23583   20470   20809   22175   13538   29815   9415    20856   30437   17221   21396   8872    12092   19047   21773   32223   13988   18752   35504   13302   28043   20840   23893   25503   20740   29379   14349   9468    24851   25830   16745   35060   16582   17173   24377   25029   26902   15501   19174   31968   30491   18559   35222   13999   27156   18554   25111   28487   34089   28527   27761   14815   16400   30117   29931   27735   21545   36623   30168   32085   23153   13264   6197    7925    4260    9841    9124    7119    14589   5954    5049    5752
3485    11419   1299    6632    4185    5779    2507    10988   9414    15851   9008    3586    16662   3402    4455    7829    6823    5661    5564    5564    5416    10055   14101   4742    6933    13733   8106    8751    8353    4111    5337    6995    22238   13455   15814   5836    15048   8923    10029   14546   14156   8763    10407   17212   17868   10455   20452   3270    11683   11254   8883    10326   7960    14275   19918   17585   11895   14463   15669   19976   13445   21806   9362    10990   11892   15784   25181   16597   15121   6811    20117   2917    11079   15437   10110   13126   18985   12450   13578   5827    22260   15107   13572   12733   18588   21233   15158   15725   6794    3345    18185   9500    12007   8655    12955   15440   8767    12661   14455   16802
20790   9466    18768   19019   16388   23183   19204   8048    25526   11781   26421   19566   32228   21381   20050   25242   20403   20767   21130   21130   22609   19411   26641   18850   22499   12497   24217   21138   19075   21988   23316   17550   14121   29479   12701   15643   26324   23500   26140   22523   10462   24788   26518   14435   9046    25032   17184   16690   27988   7647    16008   17583   10992   18647   11810   14471   29595   30575   25025   17287   5515    9936    13405   28867   16060   6910    13069   21244   22645   11751   19529   15711   26645   3659    10414   27190   19969   28015   31455   12488   12268   14358   31448   9386    3721    24083   12443   33602   24671   16002   17550   16509   18607   17505   13028   15073   9896    19605   17309   16608
5209    18274   7514    9447    9270    4512    6860    18543   7991    20049   1770    6760    12619   4480    5919    3913    6409    5710    5102    5102    3771    13094   13449   7447    6252    15933   6849    11034   7900    3664    2533    9013    29793   9396    23369   10394   14396   10775   6403    19042   21711   9433    6497    16807   25423   12117   19933   9297    7625    18426   10913   9872    15125   13727   27473   25139   4829    8593    17935   19453   20999   29361   13247   3955    11460   23338   32736   15967   14491   13983   19488   10504   10133   22991   17193   12207   18355   11270   6687    13381   29815   14701   9423    19373   26143   20582   17188   11524   4764    9986    25177   14146   17374   13177   20016   22675   16322   17157   20935   24036
11294   12965   9272    9523    8056    13686   10128   2354    16029   17305   16925   10705   23655   11884   11574   15745   12558   12435   12683   12683   13816   9914    19729   10518   13926   15996   14721   11642   12295   12593   13820   10771   13604   20906   7180    8812    19545   14003   16644   14373   5522    15291   17022   19525   9234    15536   22708   7194    19415   6113    11640   13251   6661    17127   11284   8951    20099   21078   15528   22377   4811    13172   11756   19472   14650   7150    16547   19716   18239   4465    22960   6215    18072   6803    11837   18726   21838   19442   22060   2992    13626   17788   22054   14280   9954    25508   17533   24208   15276   6505    11490   8682    10874   8483    6260    8746    400 11827   10314   10107
5303    9560    2607    8328    2468    7596    3463    9535    11232   14134   10826   4040    16990   5219    4909    9646    5902    6115    6018    6018    7151    11553   13064   4295    7261    12016   9923    10447   6988    5928    7155    5580    20785   14241   14361   4119    14011   10873   11846   16044   12703   10580   12224   15533   16415   12406   18815   4966    12750   9395    7204    8813    6101    12689   18465   16131   13713   16281   17167   18371   11992   20353   7646    12807   10213   14330   23728   15278   13802   4953    18532   3288    11407   13983   8252    12061   17400   12778   15396   4373    20807   13428   15389   10875   17135   20196   13479   17543   8611    4601    16731   10469   13133   10153   11502   13987   7314    13613   14444   15348
""";
c = readdlm(IOBuffer(dat), Int)
18×100 Matrix{Int64}:
 21328   7901  16774  24413  14131  …  28183  20874  30143  30419  29718
  5195  15055   4216   9295   6050     19554  12880  16313  18674  20915
 24025  13295  22003  22254  20379     17024  13131  22051  19380  18559
 22508  22014  17954  25593  16490     31749  23716  31201  32314  33111
 12913   5006  10217  14089   7574     16894   8883  19148  18556  18349
  6281  16514   5753  10410   7509  …  21173  14561  17427  19876  22534
  4656  19128   9008   4124  11894     19061  12862  10557  16997  20422
 16446   6905  14412  14675  11772     14574   6924  18252  16749  16109
 12070  11940  10048  10299   8061      9830   1565  12993  11480  11273
 10329  23462  13628   6311  16247     17430  12203   7915  14982  18439
  9046   7229   6350  10918   3587  …  14632   6528  16403  16201  15994
  4657  14007   4438   3771   6792     12964   6796   9616  11742  14325
 16168  26420  17376  13468  19205      7119  14589   5954   5049   5752
  3485  11419   1299   6632   4185     15440   8767  12661  14455  16802
 20790   9466  18768  19019  16388     15073   9896  19605  17309  16608
  5209  18274   7514   9447   9270  …  22675  16322  17157  20935  24036
 11294  12965   9272   9523   8056      8746    400  11827  10314  10107
  5303   9560   2607   8328   2468     13987   7314  13613  14444  15348
m = Model(HiGHS.Optimizer);
@variable(m, 400 >= x[i=1:S, j=1:D] >= 0, Int)
18×100 Matrix{VariableRef}:
 x[1,1]   x[1,2]   x[1,3]   x[1,4]   …  x[1,98]   x[1,99]   x[1,100]
 x[2,1]   x[2,2]   x[2,3]   x[2,4]      x[2,98]   x[2,99]   x[2,100]
 x[3,1]   x[3,2]   x[3,3]   x[3,4]      x[3,98]   x[3,99]   x[3,100]
 x[4,1]   x[4,2]   x[4,3]   x[4,4]      x[4,98]   x[4,99]   x[4,100]
 x[5,1]   x[5,2]   x[5,3]   x[5,4]      x[5,98]   x[5,99]   x[5,100]
 x[6,1]   x[6,2]   x[6,3]   x[6,4]   …  x[6,98]   x[6,99]   x[6,100]
 x[7,1]   x[7,2]   x[7,3]   x[7,4]      x[7,98]   x[7,99]   x[7,100]
 x[8,1]   x[8,2]   x[8,3]   x[8,4]      x[8,98]   x[8,99]   x[8,100]
 x[9,1]   x[9,2]   x[9,3]   x[9,4]      x[9,98]   x[9,99]   x[9,100]
 x[10,1]  x[10,2]  x[10,3]  x[10,4]     x[10,98]  x[10,99]  x[10,100]
 x[11,1]  x[11,2]  x[11,3]  x[11,4]  …  x[11,98]  x[11,99]  x[11,100]
 x[12,1]  x[12,2]  x[12,3]  x[12,4]     x[12,98]  x[12,99]  x[12,100]
 x[13,1]  x[13,2]  x[13,3]  x[13,4]     x[13,98]  x[13,99]  x[13,100]
 x[14,1]  x[14,2]  x[14,3]  x[14,4]     x[14,98]  x[14,99]  x[14,100]
 x[15,1]  x[15,2]  x[15,3]  x[15,4]     x[15,98]  x[15,99]  x[15,100]
 x[16,1]  x[16,2]  x[16,3]  x[16,4]  …  x[16,98]  x[16,99]  x[16,100]
 x[17,1]  x[17,2]  x[17,3]  x[17,4]     x[17,98]  x[17,99]  x[17,100]
 x[18,1]  x[18,2]  x[18,3]  x[18,4]     x[18,98]  x[18,99]  x[18,100]
∑ = sum
sum (generic function with 17 methods)
for j=1:D
   @constraint(m, ∑(x[:, j]) >= demand[j])
end
for i=1:S
   @constraint(m, ∑(x[i, :]) <= supply[i])
end
#@objective(m, Min, ∑( x[i, j]*c[i, j] for i=1:S, j=1:D))
@objective(m, Min, ∑( x .* c ));
optimize!(m)
Running HiGHS 1.5.3 [date: 1970-01-01, git hash: 45a127b78]
Copyright (c) 2023 HiGHS under MIT licence terms
Presolving model
118 rows, 1800 cols, 3600 nonzeros
118 rows, 1800 cols, 3600 nonzeros
Objective function is integral with scale 1

Solving MIP model with:
   118 rows
   1800 cols (0 binary, 1800 integer, 0 implied int., 0 continuous)
   3600 nonzeros

        Nodes      |    B&B Tree     |            Objective Bounds              |  Dynamic Constraints |       Work      
     Proc. InQueue |  Leaves   Expl. | BestBound       BestSol              Gap |   Cuts   InLp Confl. | LpIters     Time

         0       0         0   0.00%   0               inf                  inf        0      0      0         0     0.0s
 T       0       0         0   0.00%   0               7994805          100.00%        0      0      0       144     0.0s

Solving report
  Status            Optimal
  Primal bound      7994805
  Dual bound        7994805
  Gap               0% (tolerance: 0.01%)
  Solution status   feasible
                    7994805 (objective)
                    0 (bound viol.)
                    0 (int. viol.)
                    0 (row viol.)
  Timing            0.03 (total)
                    0.02 (presolve)
                    0.00 (postsolve)
  Nodes             1
  LP iterations     144 (total)
                    0 (strong br.)
                    0 (separation)
                    0 (heuristics)
@show termination_status(m)
termination_status(m) = MathOptInterface.OPTIMAL
OPTIMAL::TerminationStatusCode = 1
typeof(termination_status(m))
Enum MathOptInterface.TerminationStatusCode:
OPTIMIZE_NOT_CALLED = 0
OPTIMAL = 1
INFEASIBLE = 2
DUAL_INFEASIBLE = 3
LOCALLY_SOLVED = 4
LOCALLY_INFEASIBLE = 5
INFEASIBLE_OR_UNBOUNDED = 6
ALMOST_OPTIMAL = 7
ALMOST_INFEASIBLE = 8
ALMOST_DUAL_INFEASIBLE = 9
ALMOST_LOCALLY_SOLVED = 10
ITERATION_LIMIT = 11
TIME_LIMIT = 12
NODE_LIMIT = 13
SOLUTION_LIMIT = 14
MEMORY_LIMIT = 15
OBJECTIVE_LIMIT = 16
NORM_LIMIT = 17
OTHER_LIMIT = 18
SLOW_PROGRESS = 19
NUMERICAL_ERROR = 20
INVALID_MODEL = 21
INVALID_OPTION = 22
INTERRUPTED = 23
OTHER_ERROR = 24

value.(x)
18×100 Matrix{Float64}:
  0.0   0.0   0.0   0.0   0.0   0.0  …   0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0      0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0      0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0      0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0      0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0  …   0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0      0.0   0.0   0.0   0.0   0.0   0.0
  0.0  15.0   0.0   0.0   0.0   0.0      0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0      0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0      0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0  …   0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0  15.0   0.0   0.0      0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0      0.0  15.0   0.0  15.0  15.0  15.0
 15.0   0.0  15.0   0.0   0.0  15.0      0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0      0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0  …   0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0     15.0   0.0  15.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0  15.0   0.0      0.0   0.0   0.0   0.0   0.0   0.0
using SparseArrays
sparse(value.(x))
18×100 SparseMatrixCSC{Float64, Int64} with 109 stored entries:
⎡⠀⠀⠀⠈⠀⠖⠂⠒⠀⠄⡀⠐⠂⠀⠠⠀⠀⠁⠈⠀⠀⡀⠀⠈⢀⢀⢠⠄⠁⢀⢄⠀⠈⠠⠬⠀⠀⠀⠀⠀⎤
⎢⠐⠀⠀⠒⠀⠀⠀⠈⢁⠈⠒⡀⠀⠡⠀⢂⠐⠂⡂⠑⠀⠀⠄⡀⠂⠀⠂⠀⠈⠂⠈⠀⠀⠂⠀⠁⠀⡀⠀⠀⎥
⎢⠄⠅⠤⠀⡠⠀⠀⠄⠀⠀⠀⠁⠈⡂⠁⠀⠀⢀⠀⠁⠉⠀⢀⠀⢀⡀⠀⠉⠈⠀⠀⡀⣀⠀⠀⠈⠚⠀⠒⠒⎥
⎣⠀⠐⠀⠁⠀⠀⠐⠀⠀⠂⠀⠂⠀⠉⠀⠀⠁⠀⠀⠀⠀⠈⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠀⠀⠀⠀⠀⠈⠈⠀⎦
round.(Int, value.(x)) ≈ value.(x)
true
objective_value(m)
7.994805e6
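The reported objective can be cross-checked against the solution matrix: summing cost times shipped quantity should reproduce the same value. A minimal sketch:

shipped = round.(Int, value.(x))
sum(shipped .* c)   # 7994805, matching objective_value(m)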

Travelling salesman problem (TSP)

dist_mx = [0    17015   15303   20376   12648   16060   25600   14217   19545   30228   14726   21078   33787   18041   14937   17542   20542   16535   21328   7901    16774   24413   14131   21551   15742   21091   25167   3266    19312   14878   22914   18392   14514   21072   11535   12965   12952   12952   15839   27836;
16837   0   25356   17812   13534   1912    9654    17639   13597   15441   9377    7410    19851   4218    22311   3454    12747   6168    5195    15055   4216    9295    6050    4779    3641    15102   8395    16829   5224    3595    14177   1684    2753    4300    5469    4233    3962    3962    3039    13176;
15660   25640   0   34363   13497   27290   25638   8902    13080   25038   17262   19451   23966   21412   6475    28799   12799   19902   24025   13295   22003   22254   20379   26418   22859   10495   28761   12735   29656   23437   36220   24616   24042   28477   24395   24758   25121   25121   26548   22646;
20378   17694   34039   0   20952   16668   26779   25408   22341   31408   17228   22258   34845   19221   30215   17941   23384   18694   22508   22014   17954   25593   16490   22230   16922   24989   25524   22994   19299   15807   6155    19098   15443   21468   12730   13765   13698   13698   16518   29016;
12758   13651   13828   21135   0   15151   15588   5176    7509    19444   4278    10350   22791   9625    10004   16632   8551    7771    12913   5006    10217   14089   7574    15206   10375   9595    18453   9762    18403   10736   23395   12829   11221   17256   11328   11937   12300   12300   13779   17052;
16015   1958    27036   16957   14992   0   10700   19097   15278   16499   10835   8524    20966   5921    23905   1542    14428   7849    6281    16514   5753    10410   7509    5795    5099    16782   9308    18288   3312    4999    12625   2798    4158    5230    4648    3949    3341    3341    2010    14262;
25876   9612    25338   27027   15004   10657   0   17917   13580   5809    12445   6443    16347   7810    22293   9558    12729   9769    4656    19128   9008    4124    11894   5115    10216   15084   4473    23560   7924    11296   21642   8036    12133   5794    14532   13370   13273   13273   12492   5169;
14417   17865   8673    25755   4815    19364   18059   0   5596    18456   8654    11873   21416   13820   4849    20846   6592    11966   16446   6905    14412   14675   11772   18839   14588   7393    21182   11243   22077   14949   27612   17024   15434   20898   15787   16151   16513   16513   17993   16064;
19460   13685   12996   22480   7089    15336   13683   5712    0   13472   5106    7497    16147   9457    9192    16844   1234    7947    12070   11940   10048   10299   8061    14463   10878   2839    16806   16286   17701   11239   23901   12661   11724   16522   12122   12440   12803   12803   14282   11080;
30234   15269   25164   31647   19366   16315   5673    18651   13668   0   16883   9759    13705   12809   22119   15216   12556   14103   10329   23462   13628   6311    16247   10788   14836   14910   7331    27913   13481   15916   27214   13710   16784   11351   19112   17990   17893   17893   18028   2647;
14892   9673    17263   17588   4258    11172   12366   8632    5154    16699   0   7384    20046   5758    13440   12654   6196    3904    9046    7229    6350    10918   3587    11339   6397    7802    14626   11985   14424   6758    19420   8962    7243    13389   7493    7959    8322    8322    9801    14306;
20779   7534    19272   22294   9811    8776    6402    11851   7513    9403    7336    0   13086   3620    16227   9926    6663    4648    4657    14007   4438    3771    6792    7050    5646    9018    9393    18458   10288   6726    19802   5879    7594    9109    9647    8800    8703    8703    8838    7010;
33192   20306   24017   34707   22295   21548   16347   21600   16054   13630   19961   13477   0   16558   22049   21445   14942   17061   16168   26420   17376   13468   19205   18438   18584   14330   20479   30871   21676   19664   32739   18651   20532   20497   22329   21738   21641   21641   21776   11708;
18182   4426    21242   19317   9588    6076    7562    13686   9484    12447   5453    3297    16130   0   18198   7585    8634    2059    3485    11419   1299    6632    4185    5779    2507    10988   9414    15851   9008    3586    16662   3402    4455    7829    6823    5661    5564    5564    5416    10055;
14987   22405   6375    30372   9487    23980   22403   4893    9113    21803   13270   16216   21976   18177   0   25462   9564    16582   20790   9466    18768   19019   16388   23183   19204   8048    25526   11781   26421   19566   32228   21381   20050   25242   20403   20767   21130   21130   22609   19411;
17776   3658    28797   18666   16753   1761    9415    20858   17039   15214   12596   9773    21497   7682    25666   0   16189   9610    5209    18274   7514    9447    9270    4512    6860    18543   7991    20049   1770    6760    12619   4480    5919    3913    6409    5710    5102    5102    3771    13094;
20480   12908   12608   23592   8201    14559   12906   6659    1112    12306   6218    6720    14981   8680    9564    16068   0   7171    11294   12965   9272    9523    8056    13686   10128   2354    16029   17305   16925   10705   23655   11884   11574   15745   12558   12435   12683   12683   13816   9914;
16503   6243    19789   18281   7729    7894    9574    11828   8031    13945   3594    4731    17257   2015    16635   9403    7181    0   5303    9560    2607    8328    2468    7596    3463    9535    11232   14134   10826   4040    16990   5219    4909    9646    5902    6115    6018    6018    7151    11553;
21599   5356    23780   22734   12944   6512    4559    16359   12022   10347   8870    4592    16288   3518    20735   5531    11172   5476    0   14836   4716    4238    7602    2790    5924    13526   6105    19268   6020    7003    17779   3751    7819    4840    10240   9078    8981    8981    8209    8081;
7857    15112   13704   22169   5184    16611   19177   6973    12160   23548   7225    14334   26860   11619   9791    18093   13156   9764    14906   0   12210   17931   9181    17075   11836   14080   20835   4820    19863   12197   24516   14669   12682   19125   12572   13398   13761   13761   15240   21156;
16895   4392    21878   18019   10219   5892    8860    14318   10120   13488   6084    4338    17171   1301    18833   7374    9270    2691    4588    12050   0   7673    4076    5894    1208    11624   9813    14906   9124    2288    15363   3488    3156    7944    5524    4362    4265    4265    5193    11096;
24348   9392    22086   25619   13415   10603   4086    14665   10327   6404    10856   3801    13508   6782    19041   9622    9477    8252    4345    17612   7601    0   10396   5506    8809    11832   7814    22062   8930    9888    21869   7737    10757   7533    13085   11963   11866   11866   12000   4056;
14132   6093    20346   16695   7583    7592    12111   11714   8191    16482   3384    7267    19793   4552    16522   9074    8119    2537    7839    9201    3822    10865   0   8055    2817    10474   11975   11763   10844   3178    15840   5650    3662    10106   4502    4379    4742    4742    6221    14090;
21436   4723    26289   22411   15307   5769    5072    18868   14531   10881   11172   7101    18750   5829    23244   4673    13681   7779    2740    17138   5827    5517    8192    0   6327    16035   3919    19343   3552    7406    16757   3376    7353    2373    10069   8833    8561    8561    7639    8817;
15687   3789    22924   16811   10402   5288    9892    14507   10983   14520   6245    5370    18203   2333    19315   6770    10316   3737    5620    11923   1066    8705    2918    6156    0   12670   10075   13698   8540    1080    14155   3750    1948    7846    4316    3154    3057    3057    3985    12128;
21045   15226   10254   25146   9186    16877   15224   7459    2666    14624   7772    9038    14277   10998   7984    18386   2386    9488    13611   13852   11589   11840   10374   16004   12446   0   18347   17870   19242   13023   25973   14202   13891   18063   14789   14753   15000   15000   16134   12232;
25093   8380    28419   25988   17792   9365    4449    20998   16661   7422    14499   9231    20447   9297    25375   8231    15811   11255   6033    20614   9633    7647    11998   3972    10133   18165   0   23149   6373    11212   20107   7182    11009   4244    13725   12489   12218   12218   11295   9268;
3269    17087   12447   23091   9425    18535   23676   10994   16323   28047   11778   18833   31359   16118   11733   20016   17319   14102   19392   4764    14895   22430   11699   19128   13889   17868   23048   0   21787   14225   25619   16722   14491   21144   14247   15207   15570   15570   17049   25655;
19370   5218    29555   19904   18347   3354    7939    22134   17796   13628   14190   10366   22016   8979    26510   2032    16946   10929   5890    19868   8977    8967    10863   3548    8454    19301   6286    21642   0   8354    13849   6031    7512    2230    8002    7303    6695    6695    5364    12267;
15123   3742    23129   16247   10398   5189    10973   14503   10979   15601   6241    6451    19284   3414    19310   6671    10520   3928    6701    11919   2147    9786    2914    7237    1115    12875   11156   13694   8441    0   13591   4046    1289    7798    3752    2590    2493    2493    3730    13209;
23066   12913   36235   6148    23328   11248   21677   27604   24080   27384   19342   18864   31697   15567   32411   12509   23813   17221   17470   24390   15398   21598   16015   16775   14408   26168   20042   25682   13816   13293   0   13987   12929   15985   12394   11789   11185   11185   11655   25357;
18306   1655    24425   19281   12816   2898    8020    16914   12666   13807   8681    5726    18168   3338    21380   4425    11816   5288    3560    14647   3336    7611    5684    3220    3721    14171   7140    16834   6173    4291    15523   0   4102    5029    6939    5703    5431    5431    4509    11541;
14792   2858    23883   15916   11147   4305    11899   15252   11728   16527   6990    7377    20210   4340    20060   5787    11446   4854    7592    12668   3073    10712   3664    7283    2041    13801   11010   14409   7557    926 13261   4086    0   6915    3422    2259    2162    2162    2847    14135;
20940   4227    28327   21915   17230   5179    5981    20906   16569   11732   13095   9139    20788   7752    25283   4045    15719   9701    4662    19061   7750    7739    10115   2321    7744    18073   4390    20933   2200    7698    15933   4990    6857    0   9572    8336    8065    8065    7142    11040;
11410   5695    24148   12807   11182   4838    14370   15516   12038   18999   7130    9803    22682   6812    20324   6320    12537   6289    10099   12463   5545    13184   4544    10230   4538    14686   13846   14019   8090    3645    12463   7098    3281    9751    0   1743    1730    1730    4518    16606;
13077   4452    24715   14052   11979   4191    13470   16084   12560   18098   7822    8948    21781   5911    20892   5673    12489   6127    9198    13500   4644    12283   4496    8987    3612    14843   12603   15241   7443    2497    11952   5855    2133    8508    1710    0   880 880 3275    15706;
13133   4063    25053   14108   12317   3692    13081   16422   12899   17709   8160    8559    21392   5523    21230   5174    12629   6036    8810    13799   4255    11895   4834    8599    3223    14983   12215   15468   6944    2108    11453   5467    1745    8120    1765    608 0   0   2886    15317;
13133   4063    25053   14108   12317   3692    13081   16422   12899   17709   8160    8559    21392   5523    21230   5174    12629   6036    8810    13799   4255    11895   4834    8599    3223    14983   12215   15468   6944    2108    11453   5467    1745    8120    1765    608 0   0   2886    15317;
14506   3112    26511   15448   13883   2991    12511   17988   14464   17843   9726    8693    21526   5396    22796   4473    13902   7323    8052    15364   5083    12028   6400    7648    4050    16257   11264   17034   6243    3674    12711   4516    2915    7169    3139    2440    1832    1832    0   15451;
27829   13231   22760   29340   16962   14388   4973    16247   11264   2392    14487   7453    11758   10503   19715   13298   10151   11698   8249    21057   11322   4107    13842   8754    12530   12506   9105    25508   12177   13609   25382   11629   14478   10768   16798   15683   15587   15587   15721   0;
];
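A quick sanity check of the distance matrix can catch copy-paste errors before the model is built; this is only a minimal sketch, assuming dist_mx is the full matrix entered above:

using LinearAlgebra
size(dist_mx)                 # expect a square matrix (one row/column per city)
all(iszero, diag(dist_mx))    # the distance from a city to itself should be 0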
using JuMP, HiGHS
N = 25 # the problem is NP-hard, so we solve it only for the first 25 cities
m = Model(optimizer_with_attributes(HiGHS.Optimizer));
JuMP.set_silent(m)
@variable(m, x[f=1:N, t=1:N], Bin)
@objective(m, Min, sum( x[i, j]*dist_mx[i,j] for i=1:N,j=1:N))
@constraint(m, notself[i=1:N], x[i, i] == 0)
@constraint(m, oneout[i=1:N], sum(x[i, 1:N]) == 1)
@constraint(m, onein[j=1:N], sum(x[1:N, j]) == 1)
for f=1:N, t=1:N
    @constraint(m, x[f, t]+x[t, f] <= 1)
end
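The pairwise constraints x[f, t] + x[t, f] <= 1 rule out the shortest possible subtours (going from f to t and straight back); longer subtours are eliminated iteratively below.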


optimize!(m)
function getcycle(x_v, N)
    # Extract the cycle that starts at city 1 from a 0/1 solution matrix.
    x_val = round.(Int, x_v)          # guard against solver round-off
    cycle_idx = Vector{Int}()
    push!(cycle_idx, 1)               # start the tour at city 1
    while true
        # the successor of the current city is the column that holds the 1
        v, idx = findmax(x_val[cycle_idx[end], 1:N])
        if idx == cycle_idx[1]
            break                     # back at the start: the cycle is closed
        else
            push!(cycle_idx, idx)
        end
    end
    cycle_idx
end
getcycle (generic function with 1 method)
getcycle(value.(x), N)
22-element Vector{Int64}:
  1
  4
 23
 18
 14
 21
 25
  2
  6
 16
 24
 19
 12
 22
  7
 10
 13
 17
  9
 11
  5
 20
function solved(m, cycle_idx, N)
    # If the cycle is shorter than N, forbid this subtour and report failure.
    if length(cycle_idx) < N
        cc = @constraint(m, sum(x[cycle_idx, cycle_idx]) <= length(cycle_idx) - 1)
        println("Added a constraint since the cycle length was $(length(cycle_idx)) < $N")
        return false
    end
    return true
end

solved(m, getcycle(value.(x), N),N)
Added a constraint since the cycle length was 22 < 25
false

Handling of subtours (cycles) is inspired by the tutorial available at: https://opensourc.es/blog/mip-tsp/
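Whenever the assignment solution contains a subtour S visiting fewer than N cities, the constraint sum(x[S, S]) <= length(S) - 1 is added and the model is re-solved, until a single tour through all N cities remains.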

while true
    optimize!(m)
    status = termination_status(m)
    cycle_idx = getcycle(value.(x), N)
    if solved(m, cycle_idx,N)
        break;
    end
end
Added a constraint since the cycle length was 5 < 25
Added a constraint since the cycle length was 10 < 25
Added a constraint since the cycle length was 10 < 25
Added a constraint since the cycle length was 15 < 25
Added a constraint since the cycle length was 7 < 25
Added a constraint since the cycle length was 12 < 25
Added a constraint since the cycle length was 15 < 25
Added a constraint since the cycle length was 13 < 25
Added a constraint since the cycle length was 20 < 25
Added a constraint since the cycle length was 18 < 25
println("Trip length $(objective_value(m))")
Trip length 152803.99999999997
using CSV, DataFrames, HTTP
url = "https://szufel.pl/tsl/Subway_LV.csv"
sbws_la = DataFrame(CSV.File(HTTP.get(url).body))
sbws_la
118×5 DataFrame
93 rows omitted
Row long latt name address node
Float64 Float64 String7 String Int64
1 -115.314 36.1003 Subway 10140 W Tropicana Ave, Las Vegas, NV, 89147 5503377919
2 -115.141 36.1148 Subway 1040 E Flamingo, Las Vegas, NV, 89109 2747920069
3 -115.325 36.2187 Subway 10470 W Cheyanne, Las Vegas, NV, 89129 2590273363
4 -115.207 35.9976 Subway 10550 Southern Highlands Pky, Las Vegas, NV, 89141 5514940961
5 -115.244 36.1588 Subway 1105 S Rainbow, Las Vegas, NV, 89102 5794871693
6 -115.137 36.1011 Subway 1196 E Tropicana Ave, Las Vegas, NV, 89119 5717536771
7 -115.08 36.1575 Subway 1300 South Lamb Blvd., Las Vegas, NV, 89104 4807070083
8 -115.26 36.1901 Subway 1750 N Buffalo, Las Vegas, NV, 89128 5778125595
9 -115.206 36.1948 Subway 1940 N Decatur Blvd, Las Vegas, NV, 89108 5797720840
10 -115.062 36.1956 Subway 1961 N Nellis Blvd, Las Vegas, NV, 89115 5521321066
11 -115.208 36.1498 Subway 2003 S Decatur Blvd, Las Vegas, NV, 89102 5844881906
12 -115.144 36.1707 Subway 202 Fremont Street, Las Vegas, NV, 89101 3813314952
13 -115.12 36.2762 Subway 2225 E Centennial Pky, Las Vegas, NV, 89081 5509334872
⋮
107 -115.149 36.0855 Subway C Gate McCarran Int Airport, Las Vegas, NV, 89111 4538984240
108 -115.146 36.1715 Subway One South Main St, Las Vegas, NV, 89125 3813314954
109 -115.178 36.2857 Subway 6885 Aliante Parkway, N Las Vegas, NV, 89084 4208824392
110 -115.118 36.2176 Subway 2265 Cheyenne Avenue, North Las Vegas, NV, 89030 5851001667
111 -115.116 36.24 Subway 2546 E Craig Rd, North Las Vegas, NV, 89030 5534608424
112 -115.109 36.2081 Subway 2668 North Las Vegas Blvd, North Las Vegas, NV, 89030 5483724476
113 -115.179 36.2396 Subway 2816 W Craig Rd, North Las Vegas, NV, 89030 5513681535
114 -115.182 36.2617 Subway 3030 W Ann Rd, North Las Vegas, NV, 89031 5512410055
115 -115.193 36.1999 Subway 3950 W Lake Mead Blvd, North Las Vegas, NV, 89032 5788688096
116 -115.097 36.2396 Subway 4375 N Pecos Rd, North Las Vegas, NV, 89030 5534608851
117 -115.155 36.2603 Subway 5546 Camino Al Norte, North Las Vegas, NV, 89031 5096983700
118 -115.18 36.2753 Subway 6360 Simmons St., North Las Vegas, NV, 89031 5512410390
using Plots

cycle_idx = getcycle(value.(x), N)
ids = vcat(cycle_idx, cycle_idx[1])

p = plot(sbws_la.long[ids], sbws_la.latt[ids])
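As a possible extension (a sketch using standard Plots calls; the label text and file name are only illustrative), the visited store locations can be overlaid on the tour and the figure saved:

scatter!(p, sbws_la.long[cycle_idx], sbws_la.latt[cycle_idx], markersize=3, label="visited stores")
# savefig(p, "tsp_tour.png")   # write the figure to disk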

Optional Gurobi example

using JuMP, Gurobi
N = 25 # the problem is NP-hard, so we solve it only for the first 25 cities
m2 = Model(optimizer_with_attributes(Gurobi.Optimizer));
set_attribute(m2, "TimeLimit", 60)
JuMP.set_silent(m2)
@variable(m2, x[f=1:N, t=1:N], Bin)
@objective(m2, Min, sum( x[i, j]*dist_mx[i,j] for i=1:N,j=1:N))
@constraint(m2, notself[i=1:N], x[i, i] == 0)
@constraint(m2, oneout[i=1:N], sum(x[i, 1:N]) == 1)
@constraint(m2, onein[j=1:N], sum(x[1:N, j]) == 1)
for f=1:N, t=1:N
    @constraint(m2, x[f, t]+x[t, f] <= 1)
end
#optimize!(m2)

function callbackhandle(cb)
    cycle_idx =  getcycle(callback_value.(Ref(cb), x), N)
    println("Callback! N= $N cycle_idx: $cycle_idx of length: $(length(cycle_idx))")
    if length(cycle_idx) < N
        con = @build_constraint(sum(x[cycle_idx,cycle_idx]) <= length(cycle_idx)-1)
        MOI.submit(m2, MOI.LazyConstraint(cb), con)
        print("added a lazy constraint: ")
        println(con)
    end
end
MOI.set(m2, MOI.LazyConstraintCallback(), callbackhandle)
optimize!(m2)
println("Trip length $(objective_value(m2))")

Sample quadratic model - minimizing sum of squares

using LinearAlgebra
# here we generate a sample problem structure
x = rand(Float64,(1000,2))*Diagonal([6,10]) .- [4 3]
ϵ = randn(1000).*2 #errors

a, b, c = 1, 10, 7 # values we are trying to estimate
A = [a b/2;b/2 c]

# y = a * x₁² + b*x₁x₂ + c x₂²
y = (x * A ) .* x * [1;1] .+ ϵ # explained variable
;
y'
1×1000 adjoint(::Vector{Float64}) with eltype Float64:
 -27.7648  84.2602  7.41514  10.4189  …  46.4475  202.429  28.5886  1.53657
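Because the model is linear in the coefficients a, b and c, a quick cross-check is possible with ordinary least squares via the backslash operator; this is only a sketch (Z, a_hat, b_hat and c_hat are names introduced here, not part of the original workflow):

# design matrix with columns x₁², x₁x₂ and x₂²
Z = hcat(x[:, 1].^2, x[:, 1] .* x[:, 2], x[:, 2].^2)
a_hat, b_hat, c_hat = Z \ y   # least-squares estimates of a, b, c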
using Ipopt
m = Model(optimizer_with_attributes(Ipopt.Optimizer)); 

@variable(m, aa[1:2,1:2])

function errs(aa)
   sum((y .- (x * aa ) .* x * [1;1]) .^ 2)
end

@objective(m, Min, errs(aa))

optimize!(m)

******************************************************************************
This program contains Ipopt, a library for large-scale nonlinear optimization.
 Ipopt is released as open source code under the Eclipse Public License (EPL).
         For more information visit https://github.com/coin-or/Ipopt
******************************************************************************

This is Ipopt version 3.14.13, running with linear solver MUMPS 5.6.1.

Number of nonzeros in equality constraint Jacobian...:        0
Number of nonzeros in inequality constraint Jacobian.:        0
Number of nonzeros in Lagrangian Hessian.............:       10

Total number of variables............................:        4
                     variables with only lower bounds:        0
                variables with lower and upper bounds:        0
                     variables with only upper bounds:        0
Total number of equality constraints.................:        0
Total number of inequality constraints...............:        0
        inequality constraints with only lower bounds:        0
   inequality constraints with lower and upper bounds:        0
        inequality constraints with only upper bounds:        0

iter    objective    inf_pr   inf_du lg(mu)  ||d||  lg(rg) alpha_du alpha_pr  ls
   0  1.4434260e+07 0.00e+00 1.00e+02  -1.0 0.00e+00    -  0.00e+00 0.00e+00   0
   1  4.0406556e+03 0.00e+00 7.00e-04  -1.0 7.00e+00  -4.0 1.00e+00 1.00e+00f  1
   2  4.0406514e+03 0.00e+00 5.20e-09  -5.7 1.56e-04  -4.5 1.00e+00 1.00e+00f  1

Number of Iterations....: 2

                                   (scaled)                 (unscaled)
Objective...............:   1.0587604556415567e-01    4.0406514404900372e+03
Dual infeasibility......:   5.2033165641907263e-09    1.9857927691191438e-04
Constraint violation....:   0.0000000000000000e+00    0.0000000000000000e+00
Variable bound violation:   0.0000000000000000e+00    0.0000000000000000e+00
Complementarity.........:   0.0000000000000000e+00    0.0000000000000000e+00
Overall NLP error.......:   5.2033165641907263e-09    1.9857927691191438e-04


Number of objective function evaluations             = 3
Number of objective gradient evaluations             = 3
Number of equality constraint evaluations            = 0
Number of inequality constraint evaluations          = 0
Number of equality constraint Jacobian evaluations   = 0
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations             = 1
Total seconds in IPOPT                               = 0.017

EXIT: Optimal Solution Found.
A
2×2 Matrix{Float64}:
 1.0  5.0
 5.0  7.0
value.(aa)
2×2 Matrix{Float64}:
 1.00837  5.00195
 5.00195  6.99828
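The original coefficients can be read back from the estimated matrix: a ≈ aa[1,1], b ≈ aa[1,2] + aa[2,1] and c ≈ aa[2,2], matching the values a, b, c = 1, 10, 7 used to generate the data.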
using Gurobi, JuMP


m = Model(optimizer_with_attributes(Gurobi.Optimizer)); 
@variable(m, aa[1:2,1:2])

function errs(aa)
   sum((y .- (x * aa ) .* x * [1;1]) .^ 2)
end

@objective(m, Min, errs(aa))

optimize!(m)
status = termination_status(m)

println("Cost: $(objective_value(m))")
res = value.(aa)
println("aa=$res")
println("a, b, c = $(res[1,1]), $(res[1,2]+res[2,1]), $(res[2,2])")
Set parameter Username
Academic license - for non-commercial use only - expires 2024-09-01
Gurobi Optimizer version 10.0.2 build v10.0.2rc0 (win64)

CPU model: 13th Gen Intel(R) Core(TM) i7-1355U, instruction set [SSE2|AVX|AVX2]
Thread count: 10 physical cores, 12 logical processors, using up to 12 threads

Optimize a model with 0 rows, 4 columns and 0 nonzeros
Model fingerprint: 0x3c39708a
Model has 10 quadratic objective terms
Coefficient statistics:
  Matrix range     [0e+00, 0e+00]
  Objective range  [2e+05, 4e+06]
  QObjective range [8e+04, 7e+05]
  Bounds range     [0e+00, 0e+00]
  RHS range        [0e+00, 0e+00]
Presolve time: 0.01s
Presolved: 0 rows, 4 columns, 0 nonzeros
Presolved model has 10 quadratic objective terms
Ordering time: 0.00s

Barrier statistics:
 Free vars  : 7
 AA' NZ     : 3.000e+00
 Factor NZ  : 6.000e+00
 Factor Ops : 1.400e+01 (less than 1 second per iteration)
 Threads    : 1

                  Objective                Residual
Iter       Primal          Dual         Primal    Dual     Compl     Time
   0   1.44342596e+07  1.44342596e+07  0.00e+00 3.82e+06  0.00e+00     0s
   1   1.44339482e+07  1.44342596e+07  1.58e-08 3.82e+06  0.00e+00     0s
   2   1.44325405e+07  1.44342596e+07  3.72e-09 3.82e+06  0.00e+00     0s
   3   1.44311135e+07  1.44342594e+07  1.96e-08 3.82e+06  0.00e+00     0s
   4   1.44298081e+07  1.44342593e+07  2.68e-08 3.82e+06  0.00e+00     0s
   5   1.44285376e+07  1.44342590e+07  3.29e-08 3.82e+06  0.00e+00     0s
   6   1.44273214e+07  1.44342588e+07  9.04e-08 3.82e+06  0.00e+00     0s
   7   1.44229613e+07  1.44342574e+07  2.09e-07 3.81e+06  0.00e+00     0s
   8   1.44105260e+07  1.44342498e+07  2.78e-07 3.81e+06  0.00e+00     0s
   9   1.43732750e+07  1.44341950e+07  6.31e-07 3.81e+06  0.00e+00     0s
  10   1.43216429e+07  1.44340390e+07  1.41e-06 3.80e+06  0.00e+00     0s
  11   1.42552714e+07  1.44337011e+07  1.30e-06 3.79e+06  0.00e+00     0s
  12   1.42233001e+07  1.44334829e+07  1.56e-06 3.79e+06  0.00e+00     0s
  13   1.40577195e+07  1.44317708e+07  2.07e-06 3.77e+06  0.00e+00     0s
  14   1.37865496e+07  1.44268240e+07  5.11e-06 3.73e+06  0.00e+00     0s
  15   1.35572961e+07  1.44205155e+07  6.70e-06 3.70e+06  0.00e+00     0s
  16   1.32773447e+07  1.44100932e+07  1.72e-05 3.66e+06  0.00e+00     0s
  17   1.23107168e+07  1.43498046e+07  2.97e-05 3.52e+06  0.00e+00     0s
  18   1.15289591e+07  1.42711491e+07  4.99e-05 3.41e+06  0.00e+00     0s
  19   1.00599881e+07  1.40404258e+07  7.92e-05 3.19e+06  0.00e+00     0s
  20   1.87907187e+06  8.53298174e+06  1.10e-04 1.38e+06  0.00e+00     0s
  21   6.84438572e+05  5.59182113e+06  1.19e-04 8.29e+05  0.00e+00     0s
  22   3.71409807e+03  4.04692001e+03  1.43e-04 8.29e-01  0.00e+00     0s
  23   4.04065140e+03  4.04065145e+03  6.81e-11 8.29e-07  0.00e+00     0s

Barrier solved model in 23 iterations and 0.01 seconds (0.00 work units)
Optimal objective 4.04065140e+03


User-callback calls 85, time in user-callback 0.00 sec
Cost: 4040.6514033079147
aa=[1.008368985622381 5.001945560597719; 5.001945560597719 6.998278968876772]
a, b, c = 1.008368985622381, 10.003891121195439, 6.998278968876772

Modelling pandemic dynamics in a small community
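The goal is to fit a simple discrete-time epidemic recursion to the observed case counts: new infections follow I[t] = β·I[t-1]^α·S[t-1]/N, the susceptible pool shrinks as S[t] = S[t-1] - I[t], and Ipopt estimates α and β by minimizing the sum of squared residuals ε between the modelled and observed cases.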

using Ipopt, JuMP, LinearAlgebra
obs_cases = vcat(1,2,4,8,15,27,44,58,55,32,12,3,1,zeros(13))
SI_max = length(obs_cases)
N = 300
300
obs_cases'
1×26 adjoint(::Vector{Float64}) with eltype Float64:
 1.0  2.0  4.0  8.0  15.0  27.0  44.0  …  0.0  0.0  0.0  0.0  0.0  0.0  0.0
using JuMP, Ipopt
m = Model(optimizer_with_attributes(Ipopt.Optimizer, "print_level" => 2));
@variable(m, 0.5 <= α <= 1.5)
@variable(m, 0.05 <= β <= 70)
@variable(m, 0 <= I_[1:SI_max] <= N)

@variable(m, 0 <= S[1:SI_max]  <= N)
@variable(m, ε[1:SI_max])
@constraint(m, ε .== I_ .- obs_cases  )
@constraint(m, I_[1] == 1)
for i=2:SI_max
   @NLconstraint(m, I_[i] == β*(I_[i-1]^α)*S[i-1]/N)
end

@constraint(m, S[1] == N)
for i=2:SI_max
   @constraint(m, S[i] == S[i-1]-I_[i])
end

@NLobjective(m, Min, sum(ε[i]^2 for i in 1:SI_max))
println(m)
Min ε[1] ^ 2.0 + ε[2] ^ 2.0 + ε[3] ^ 2.0 + ε[4] ^ 2.0 + ε[5] ^ 2.0 + ε[6] ^ 2.0 + ε[7] ^ 2.0 + ε[8] ^ 2.0 + ε[9] ^ 2.0 + ε[10] ^ 2.0 + ε[11] ^ 2.0 + ε[12] ^ 2.0 + ε[13] ^ 2.0 + ε[14] ^ 2.0 + ε[15] ^ 2.0 + ε[16] ^ 2.0 + ε[17] ^ 2.0 + ε[18] ^ 2.0 + ε[19] ^ 2.0 + ε[20] ^ 2.0 + ε[21] ^ 2.0 + ε[22] ^ 2.0 + ε[23] ^ 2.0 + ε[24] ^ 2.0 + ε[25] ^ 2.0 + ε[26] ^ 2.0
Subject to
 -I_[1] + ε[1] == -1
 -I_[2] + ε[2] == -2
 -I_[3] + ε[3] == -4
 -I_[4] + ε[4] == -8
 -I_[5] + ε[5] == -15
 -I_[6] + ε[6] == -27
 -I_[7] + ε[7] == -44
 -I_[8] + ε[8] == -58
 -I_[9] + ε[9] == -55
 -I_[10] + ε[10] == -32
 -I_[11] + ε[11] == -12
 -I_[12] + ε[12] == -3
 -I_[13] + ε[13] == -1
 -I_[14] + ε[14] == 0
 -I_[15] + ε[15] == 0
 -I_[16] + ε[16] == 0
 -I_[17] + ε[17] == 0
 -I_[18] + ε[18] == 0
 -I_[19] + ε[19] == 0
 -I_[20] + ε[20] == 0
 -I_[21] + ε[21] == 0
 -I_[22] + ε[22] == 0
 -I_[23] + ε[23] == 0
 -I_[24] + ε[24] == 0
 -I_[25] + ε[25] == 0
 -I_[26] + ε[26] == 0
 I_[1] == 1
 S[1] == 300
 I_[2] - S[1] + S[2] == 0
 I_[3] - S[2] + S[3] == 0
 I_[4] - S[3] + S[4] == 0
 I_[5] - S[4] + S[5] == 0
 I_[6] - S[5] + S[6] == 0
 I_[7] - S[6] + S[7] == 0
 I_[8] - S[7] + S[8] == 0
 I_[9] - S[8] + S[9] == 0
 I_[10] - S[9] + S[10] == 0
 I_[11] - S[10] + S[11] == 0
 I_[12] - S[11] + S[12] == 0
 I_[13] - S[12] + S[13] == 0
 I_[14] - S[13] + S[14] == 0
 I_[15] - S[14] + S[15] == 0
 I_[16] - S[15] + S[16] == 0
 I_[17] - S[16] + S[17] == 0
 I_[18] - S[17] + S[18] == 0
 I_[19] - S[18] + S[19] == 0
 I_[20] - S[19] + S[20] == 0
 I_[21] - S[20] + S[21] == 0
 I_[22] - S[21] + S[22] == 0
 I_[23] - S[22] + S[23] == 0
 I_[24] - S[23] + S[24] == 0
 I_[25] - S[24] + S[25] == 0
 I_[26] - S[25] + S[26] == 0
 α >= 0.5
 β >= 0.05
 I_[1] >= 0
 I_[2] >= 0
 I_[3] >= 0
 I_[4] >= 0
 I_[5] >= 0
 I_[6] >= 0
 I_[7] >= 0
 I_[8] >= 0
 I_[9] >= 0
 I_[10] >= 0
 I_[11] >= 0
 I_[12] >= 0
 I_[13] >= 0
 I_[14] >= 0
 I_[15] >= 0
 I_[16] >= 0
 I_[17] >= 0
 I_[18] >= 0
 I_[19] >= 0
 I_[20] >= 0
 I_[21] >= 0
 I_[22] >= 0
 I_[23] >= 0
 I_[24] >= 0
 I_[25] >= 0
 I_[26] >= 0
 S[1] >= 0
 S[2] >= 0
 S[3] >= 0
 S[4] >= 0
 S[5] >= 0
 S[6] >= 0
 S[7] >= 0
 S[8] >= 0
 S[9] >= 0
 S[10] >= 0
 S[11] >= 0
 S[12] >= 0
 S[13] >= 0
 S[14] >= 0
 S[15] >= 0
 S[16] >= 0
 S[17] >= 0
 S[18] >= 0
 S[19] >= 0
 S[20] >= 0
 S[21] >= 0
 S[22] >= 0
 S[23] >= 0
 S[24] >= 0
 S[25] >= 0
 S[26] >= 0
 α <= 1.5
 β <= 70
 I_[1] <= 300
 I_[2] <= 300
 I_[3] <= 300
 I_[4] <= 300
 I_[5] <= 300
 I_[6] <= 300
 I_[7] <= 300
 I_[8] <= 300
 I_[9] <= 300
 I_[10] <= 300
 I_[11] <= 300
 I_[12] <= 300
 I_[13] <= 300
 I_[14] <= 300
 I_[15] <= 300
 I_[16] <= 300
 I_[17] <= 300
 I_[18] <= 300
 I_[19] <= 300
 I_[20] <= 300
 I_[21] <= 300
 I_[22] <= 300
 I_[23] <= 300
 I_[24] <= 300
 I_[25] <= 300
 I_[26] <= 300
 S[1] <= 300
 S[2] <= 300
 S[3] <= 300
 S[4] <= 300
 S[5] <= 300
 S[6] <= 300
 S[7] <= 300
 S[8] <= 300
 S[9] <= 300
 S[10] <= 300
 S[11] <= 300
 S[12] <= 300
 S[13] <= 300
 S[14] <= 300
 S[15] <= 300
 S[16] <= 300
 S[17] <= 300
 S[18] <= 300
 S[19] <= 300
 S[20] <= 300
 S[21] <= 300
 S[22] <= 300
 S[23] <= 300
 S[24] <= 300
 S[25] <= 300
 S[26] <= 300
 (I_[2] - (β * I_[1] ^ α * S[1]) / 300.0) - 0.0 == 0
 (I_[3] - (β * I_[2] ^ α * S[2]) / 300.0) - 0.0 == 0
 (I_[4] - (β * I_[3] ^ α * S[3]) / 300.0) - 0.0 == 0
 (I_[5] - (β * I_[4] ^ α * S[4]) / 300.0) - 0.0 == 0
 (I_[6] - (β * I_[5] ^ α * S[5]) / 300.0) - 0.0 == 0
 (I_[7] - (β * I_[6] ^ α * S[6]) / 300.0) - 0.0 == 0
 (I_[8] - (β * I_[7] ^ α * S[7]) / 300.0) - 0.0 == 0
 (I_[9] - (β * I_[8] ^ α * S[8]) / 300.0) - 0.0 == 0
 (I_[10] - (β * I_[9] ^ α * S[9]) / 300.0) - 0.0 == 0
 (I_[11] - (β * I_[10] ^ α * S[10]) / 300.0) - 0.0 == 0
 (I_[12] - (β * I_[11] ^ α * S[11]) / 300.0) - 0.0 == 0
 (I_[13] - (β * I_[12] ^ α * S[12]) / 300.0) - 0.0 == 0
 (I_[14] - (β * I_[13] ^ α * S[13]) / 300.0) - 0.0 == 0
 (I_[15] - (β * I_[14] ^ α * S[14]) / 300.0) - 0.0 == 0
 (I_[16] - (β * I_[15] ^ α * S[15]) / 300.0) - 0.0 == 0
 (I_[17] - (β * I_[16] ^ α * S[16]) / 300.0) - 0.0 == 0
 (I_[18] - (β * I_[17] ^ α * S[17]) / 300.0) - 0.0 == 0
 (I_[19] - (β * I_[18] ^ α * S[18]) / 300.0) - 0.0 == 0
 (I_[20] - (β * I_[19] ^ α * S[19]) / 300.0) - 0.0 == 0
 (I_[21] - (β * I_[20] ^ α * S[20]) / 300.0) - 0.0 == 0
 (I_[22] - (β * I_[21] ^ α * S[21]) / 300.0) - 0.0 == 0
 (I_[23] - (β * I_[22] ^ α * S[22]) / 300.0) - 0.0 == 0
 (I_[24] - (β * I_[23] ^ α * S[23]) / 300.0) - 0.0 == 0
 (I_[25] - (β * I_[24] ^ α * S[24]) / 300.0) - 0.0 == 0
 (I_[26] - (β * I_[25] ^ α * S[25]) / 300.0) - 0.0 == 0

optimize!(m)

println("Cost: $(objective_value(m))")
println("α=$(value(α))\nβ=$(value(β))")
Cost: 0.4826914548884895
α=0.9978555010305015
β=2.0069930759045875
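The estimates are close to a classical SI-type model (α ≈ 1) with a transmission coefficient β ≈ 2, and the fitted infection counts below track the observed cases closely.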
round.(value.(I_), digits=2)
26-element Vector{Float64}:
  1.0
  2.01
  4.0
  7.83
 14.93
 26.94
 43.72
 58.19
 54.95
 31.87
 11.76
  3.43
  0.92
  0.24
  0.06
  0.02
  0.0
  0.0
  0.0
  0.0
  0.0
  0.0
  0.0
  0.0
  0.0
  0.0
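As a check (a minimal sketch, not part of the original transcript; I_sim, S_sim, α_est and β_est are hypothetical names), the fitted recursion can be simulated forward and compared with obs_cases:

α_est, β_est = value(α), value(β)
I_sim = zeros(SI_max)
S_sim = zeros(SI_max)
I_sim[1] = 1.0          # the outbreak starts with a single case
S_sim[1] = N            # everyone else is initially susceptible
for t in 2:SI_max
    I_sim[t] = β_est * I_sim[t-1]^α_est * S_sim[t-1] / N
    S_sim[t] = S_sim[t-1] - I_sim[t]
end
round.(I_sim, digits=2)   # should track obs_cases and value.(I_) closely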