Conversation
| See also: [`NonAdaptiveLoss`](@ref), [`GradientScaleAdaptiveLoss`](@ref), | ||
| [`MiniMaxAdaptiveLoss`](@ref). | ||
| """ | ||
| abstract type AbstractAdaptiveLoss end |
There was a problem hiding this comment.
the changes from the other branch are being caught here
| dv_op = split(string(dv), "(")[1] | ||
| call_pat = Regex("\\b" * dv_op * "\\([^\\)]*\\)") | ||
| res_str = replace(res_str, call_pat => "NN_$(j)($(iv_list))") |
There was a problem hiding this comment.
😅 no don't generate to strings, generate to symbolic expressions directly for build_function
| if chain === nothing | ||
| # We manually construct a fully connected network as fallback | ||
| # If ModelingToolkitNeuralNets is not available we could build manually via Lux | ||
| chain = Lux.Chain(Lux.Dense(length(ivs), width, activation), Lux.Dense(width, length(dvs))) |
There was a problem hiding this comment.
just take it in, simpler than an auto api
| elseif symbolic_expression_style == :compact | ||
| _build_compact_symbolic_loss(eqs, bcs, dvs, ivs; n_points=n_points, bc_weight=bc_weight) |
There was a problem hiding this comment.
remove these, not necessary
|
you're going to want to use https://github.com/SciML/ModelingToolkitNeuralNets.jl for the registration to work out, in particular https://docs.sciml.ai/ModelingToolkitNeuralNets/stable/symbolic_ude_tutorial/ |
thanks for the heads up |
Checklist
contributor guidelines, in particular the SciML Style Guide and
COLPRAC.
Additional context
Context / Background
As part of the continuous evolution of NeuralPDE.jl and preparation for the GSoC 2026 project "Modelingtoolkit based Parser for Physics Informed Neural Networks", this PR introduces a highly experiential prototype MVP that natively bridges ModelingToolkit equations to compiled Lux loss functions without manual Julia AST string/macro manipulation.
Currently, the build_symbolic_loss_function pipeline constructs explicit Expr trees. By instead lowering to Symbolics.build_function, we allow the compiler to build structurally sound evaluation pipelines spanning arbitrary boundary conditions and multi-variable differential operations via Lazy Grid-Sums.
Changes Made:-
New Core File: Added src/symbolic_pinn_parser.jl
Parser Logic: Ported the _replace_dv_calls and build_pinn_loss routines perfectly mapping symbolic Differential operators onto Lux.apply operations dynamically at compilation time.
Export Alignment: Exported build_pinn_loss seamlessly into the core module framework (src/NeuralPDE.jl).
Demonstration: Included a demo_symbolic_parser.jl script showing exactly how to execute a 1D Advection Equation mapping from PDESystem directly explicitly printed symbolic templates demo_symbolic_expression.txt.
Verification:-
Confirmed the 1D Advection PDE solves correctly via Optimization with 0.27 Max Error across a 10-point grid spanning a trivial 1-layer Neural Network.
Generated perfectly matching expanded Symbolic equation blocks exactly mapping layer evaluations directly to grid residuals perfectly scaled.
All structural changes remain completely non-breaking to the existing API structure, adding purely additive experimental features.
Add any other context about the problem here.