
Conversation

pulsipher (Collaborator) commented on Jan 3, 2022

This follows from jump-dev/JuMP.jl#2842.

Until JuMP.jl releases this bug fix in its next version, we can use the following workaround:

using InfiniteOpt
model = InfiniteModel()
@variable(model, z)
my_max(a...) = max(a...)
@register(model, my_max(a, b))
@objective(model, Min, my_max(0, z))

Once JuMP.jl is updated, we can register the function as normal:

using InfiniteOpt
model = InfiniteModel()
@variable(model, z)
@register(model, max(a, b))
@objective(model, Min, max(0, z))
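
For reference, here is a minimal end-to-end sketch of solving a model that uses the registered workaround function. The choice of Ipopt is an assumption (any NLP-capable solver works), and the bound on z is added only so the sketch has a well-defined optimum:

using InfiniteOpt, Ipopt              # Ipopt is assumed; not part of the original example
model = InfiniteModel(Ipopt.Optimizer)
@variable(model, z >= 1)              # bound added so the minimum is attained at z = 1
my_max(a...) = max(a...)              # wrapper to sidestep the JuMP.jl bug
@register(model, my_max(a, b))        # register the 2-argument version
@objective(model, Min, my_max(0, z))
optimize!(model)                      # expected optimum: objective = 1 at z = 1
value(z)

Here optimize! and value are the usual JuMP-style query functions that InfiniteOpt extends.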

pulsipher added the bug label on Jan 3, 2022
codecov bot commented Jan 3, 2022

Codecov Report

Merging #195 (4e73960) into master (50b71c4) will decrease coverage by 0.00%.
The diff coverage is n/a.


@@            Coverage Diff             @@
##           master     #195      +/-   ##
==========================================
- Coverage   99.81%   99.81%   -0.01%     
==========================================
  Files          33       33              
  Lines        7074     7072       -2     
==========================================
- Hits         7061     7059       -2     
  Misses         13       13              
Impacted Files    Coverage Δ
src/nlp.jl        100.00% <ø> (ø)

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 50b71c4...4e73960.

pulsipher merged commit b1fede0 into master on Jan 3, 2022
pulsipher deleted the nlp_fix branch on Jan 3, 2022 at 21:27
