Merged
Add binary broadcasting support and LinearAlgebra.norm for JuMP array expressions, then use them in the neural.jl example to minimize the L2 loss between network output and target data. https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
Test all three binary broadcasting dispatches (JuMPArray-Array, Array-JuMPArray, JuMPArray-JuMPArray), LinearAlgebra.norm on array expressions, and the full L2 loss pipeline from the neural example. https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
Required for the `import LinearAlgebra` in src/JuMP/operators.jl to extend `LinearAlgebra.norm` for array expressions. https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
JuMP's standard @objective cannot convert ArrayDiff's GenericArrayExpr into MOI format. The example and test now construct the L2 loss expression without setting it as a JuMP objective. https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
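The commits above can be pictured with a minimal usage sketch. This is not the actual `neural.jl` code: the model setup, dimensions, and the names `w`, `x`, `target`, and `output` are illustrative assumptions; only the broadcast/norm pipeline reflects what the PR describes.

```julia
# Hypothetical sketch, not the example's actual code.
using JuMP
import LinearAlgebra

model = Model()
@variable(model, w[1:3, 1:4])   # assumed network weights
x = rand(4, 10)                 # assumed input batch
target = rand(3, 10)            # assumed target data

output = w * x                  # network output as an array expression

# Broadcast subtraction and LinearAlgebra.norm on array expressions are
# what this PR adds.  The L2 loss is constructed as an expression but is
# NOT passed to @objective, since @objective cannot convert the package's
# GenericArrayExpr into MOI format (per the commit message above).
loss = LinearAlgebra.norm(output .- target)
```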
JuMP defines `LinearAlgebra.norm(::AbstractArray{<:AbstractJuMPScalar})`
which throws UnsupportedNonlinearOperator. Our AbstractJuMPArray types
have elements that are AbstractJuMPScalar, causing ambiguity. Constrain
both the container and element type to resolve it.
https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
Define LinearAlgebra.norm for GenericArrayExpr and ArrayOfVariables
separately instead of for the abstract AbstractJuMPArray type. This
avoids any dispatch ambiguity with JuMP's error-throwing norm method
for AbstractArray{<:AbstractJuMPScalar}.
https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
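The dispatch fix described in this commit might look roughly like the following sketch. The method bodies and the helper name `_array_norm` are assumptions; the point is only the choice of concrete signatures over the abstract one.

```julia
# Hypothetical sketch of the dispatch fix.
# JuMP already defines an error-throwing method:
#   LinearAlgebra.norm(::AbstractArray{<:AbstractJuMPScalar})
# A method on the abstract AbstractJuMPArray type (whose elements are
# AbstractJuMPScalar) would be ambiguous with it, so norm is defined on
# each concrete type instead, which wins dispatch unambiguously.
import LinearAlgebra

LinearAlgebra.norm(x::GenericArrayExpr) = _array_norm(x)   # _array_norm is an assumed helper
LinearAlgebra.norm(x::ArrayOfVariables) = _array_norm(x)
```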
Codecov Report: ✅ All modified and coverable lines are covered by tests.

@@           Coverage Diff           @@
##             main      #38   +/-  ##
=======================================
+ Coverage   90.82%   90.92%   +0.09%
=======================================
  Files          20       20
  Lines        2377     2391      +14
=======================================
+ Hits         2159     2174      +15
+ Misses        218      217       -1
JuMP's GenericNonlinearExpr constructor validates arguments via _is_real(). GenericArrayExpr was missing this method, causing norm() to fail when wrapping array expressions in NonlinearExpr. Also consolidate the L2 loss tests into a single comprehensive test. https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
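The fix this commit describes likely amounts to a one-line method definition; the sketch below is an assumption about its exact form, grounded only in the commit message (`_is_real` is a JuMP-internal validation hook).

```julia
# Hypothetical sketch: JuMP's GenericNonlinearExpr constructor validates
# each argument via _is_real.  Declaring array expressions real-valued
# lets norm() wrap a GenericArrayExpr in a NonlinearExpr without failing.
import JuMP

JuMP._is_real(::GenericArrayExpr) = true
```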