
Add L2 loss to neural network example #38

Merged

blegat merged 21 commits into main from claude/add-l2-loss-neural-LqaIX on Apr 3, 2026

Conversation

@blegat (Owner) commented Apr 2, 2026

Add binary broadcasting support and LinearAlgebra.norm for JuMP array
expressions, then use them in the neural.jl example to minimize the L2
loss between network output and target data.

https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
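For reference, the L2 loss being minimized is the squared Euclidean norm of the residual between network output and target. A minimal plain-Julia sketch of that quantity (no JuMP or ArrayDiff types involved; the vectors are made-up illustration data):

```julia
using LinearAlgebra

# Squared L2 loss between a network output and target data.
l2_loss(output, target) = norm(output - target)^2

output = [1.0, 2.0, 3.0]
target = [1.0, 1.0, 1.0]

l2_loss(output, target)  # ≈ 0.0^2 + 1.0^2 + 2.0^2 = 5.0
```

In the PR itself, `output` would be a JuMP array expression rather than a plain vector, which is why `norm` needs to be extended for those types.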

claude added 7 commits April 2, 2026 14:31
Add binary broadcasting support and LinearAlgebra.norm for JuMP array
expressions, then use them in the neural.jl example to minimize the L2
loss between network output and target data.

https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
Test all three binary broadcasting dispatches (JuMPArray-Array,
Array-JuMPArray, JuMPArray-JuMPArray), LinearAlgebra.norm on array
expressions, and the full L2 loss pipeline from the neural example.

https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
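The three binary dispatch combinations being tested can be illustrated with a toy wrapper type. `Wrap` below is a hypothetical stand-in for a JuMP array type, not ArrayDiff's actual implementation; it only shows why each pairing needs its own method:

```julia
# Toy stand-in for a JuMP/ArrayDiff array wrapper (hypothetical type).
struct Wrap{T}
    data::Vector{T}
end

# The three binary dispatches, analogous to the ones under test:
Base.:+(a::Wrap, b::Vector) = Wrap(a.data .+ b)       # JuMPArray–Array
Base.:+(a::Vector, b::Wrap) = Wrap(a .+ b.data)       # Array–JuMPArray
Base.:+(a::Wrap, b::Wrap)   = Wrap(a.data .+ b.data)  # JuMPArray–JuMPArray

w = Wrap([1, 2])
(w + [10, 20]).data  # [11, 22]
```

Each method is needed because Julia dispatches on both argument positions; omitting any one pairing leaves a `MethodError` for that combination.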
Required for the `import LinearAlgebra` in src/JuMP/operators.jl to
extend `LinearAlgebra.norm` for array expressions.

https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
JuMP's standard @objective cannot convert ArrayDiff's GenericArrayExpr
into MOI format. The example and test now construct the L2 loss
expression without setting it as a JuMP objective.

https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
JuMP defines `LinearAlgebra.norm(::AbstractArray{<:AbstractJuMPScalar})`
which throws UnsupportedNonlinearOperator. Our AbstractJuMPArray types
have elements that are AbstractJuMPScalar, causing ambiguity. Constrain
both the container and element type to resolve it.

https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
Define LinearAlgebra.norm for GenericArrayExpr and ArrayOfVariables
separately instead of for the abstract AbstractJuMPArray type. This
avoids any dispatch ambiguity with JuMP's error-throwing norm method
for AbstractArray{<:AbstractJuMPScalar}.

https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
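The ambiguity described in these two commits can be reproduced in plain Julia: a method on `AbstractArray{<:Scalar}` and a method on an abstract container type, where neither signature is more specific than the other. Defining the function on the concrete types instead (as the final commit does for `GenericArrayExpr` and `ArrayOfVariables`) sidesteps it, since a concrete type is unambiguously most specific. All names below are toy stand-ins, not the package's actual types:

```julia
abstract type ToyScalar end                        # stands in for AbstractJuMPScalar
struct S <: ToyScalar end

abstract type ToyArr{T} <: AbstractArray{T,1} end  # stands in for AbstractJuMPArray
struct ConcreteArr <: ToyArr{S}                    # stands in for GenericArrayExpr
    n::Int
end
Base.size(a::ConcreteArr) = (a.n,)
Base.getindex(::ConcreteArr, ::Int) = S()

mynorm(::AbstractArray{<:ToyScalar}) = error("unsupported")  # the error-throwing method
# mynorm(::ToyArr) = :ok      # would be ambiguous with the method above
mynorm(::ConcreteArr) = :ok   # concrete dispatch: strictly more specific, no ambiguity

mynorm(ConcreteArr(3))  # :ok
```

The commented-out abstract method is ambiguous because `ToyArr{T}` for arbitrary `T` is not a subtype of `AbstractArray{<:ToyScalar}`, and vice versa.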
codecov bot commented Apr 3, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 90.92%. Comparing base (9c50c6e) to head (261aea0).
⚠️ Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main      #38      +/-   ##
==========================================
+ Coverage   90.82%   90.92%   +0.09%     
==========================================
  Files          20       20              
  Lines        2377     2391      +14     
==========================================
+ Hits         2159     2174      +15     
+ Misses        218      217       -1     


claude added 14 commits April 3, 2026 15:08
JuMP's GenericNonlinearExpr constructor validates arguments via
_is_real(). GenericArrayExpr was missing this method, causing norm()
to fail when wrapping array expressions in NonlinearExpr. Also
consolidate the L2 loss tests into a single comprehensive test.

https://claude.ai/code/session_01GWT1QHA3D5BpMQBEHvgbcV
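The failure mode this commit fixes can be mimicked without JuMP: a constructor that validates its arguments through a trait function fails for a new type until the trait method is defined for it. The names below are toy stand-ins for JuMP's internals, not its actual API:

```julia
# Toy trait-based validation, mimicking the _is_real check described above.
_is_real_like(::Real) = true
_is_real_like(::Any) = false   # default: reject unknown argument types

struct ToyNonlinearExpr
    head::Symbol
    args::Vector{Any}
    function ToyNonlinearExpr(head, args)
        all(_is_real_like, args) || error("invalid argument to $head")
        new(head, args)
    end
end

struct ToyArrayExpr end              # stands in for GenericArrayExpr
# Without the next method, ToyNonlinearExpr(:norm, Any[ToyArrayExpr()]) throws:
_is_real_like(::ToyArrayExpr) = true

ToyNonlinearExpr(:norm, Any[ToyArrayExpr()]).head  # :norm
```

Adding the one-line trait method is all that is needed for the wrapping constructor to accept the new argument type.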
@blegat blegat merged commit 9430d09 into main Apr 3, 2026
5 checks passed
