
Implementing a working example for NeuralDAE #755

Open
nghiahhnguyen opened this issue Jul 31, 2022 · 3 comments

nghiahhnguyen commented Jul 31, 2022

Hi, I am trying to implement a working example for NeuralDAE. Since there is no example for NeuralDAE yet, I looked through some related issues and chose the neural_dae test as a starting point. After fixing a few problems, I got the following code:

using DiffEqFlux, Flux, Optimization, OptimizationOptimJL, OrdinaryDiffEq

# A desired MWE for now, not a test yet.

function rober(du,u,p,t)
  y₁,y₂,y₃ = u
  k₁,k₂,k₃ = p
  du[1] = -k₁*y₁ + k₃*y₂*y₃
  du[2] =  k₁*y₁ - k₃*y₂*y₃ - k₂*y₂^2
  du[3] =  y₁ + y₂ + y₃ - 1
  nothing
end
M = [1. 0  0
     0  1. 0
     0  0  0]
u₀ = [1.0,0.0,0.0]
prob_mm = ODEProblem(ODEFunction(rober,mass_matrix=M),u₀,(0.0,10.0),(0.04,3e7,1e4))
sol = solve(prob_mm,Rodas5(),reltol=1e-8,abstol=1e-8)

tspan=(1e-6, 1e5)

dudt2 = Flux.Chain(x -> x.^3,Flux.Dense(6,50,tanh),Flux.Dense(50,2))

truedu0 = similar(u₀)

ndae = NeuralDAE(dudt2, (u,p,t) -> [u[1] + u[2] + u[3] - 1], tspan, M, DImplicitEuler(),
                        differential_vars = [true,true,false])
rober(truedu0,u₀,(0.04,3e7,1e4),0.0) # fill truedu0 with du/dt at t = 0 from the known RHS

ndae(u₀,truedu0,Float64.(ndae.p))

function predict_n_dae(p)
    ndae(u₀,truedu0,p)
end

function loss(p)
    pred = predict_n_dae(p)
    loss = sum(abs2,sol .- pred)
    loss,pred
end

iter = 0
function callback(θ,l,pred...)
    global iter
    iter += 1
    if iter%10 == 0
        println(l)
    end
    return false
end

# p = p .+ rand(3) .* p

optfunc = Optimization.OptimizationFunction((x, p) -> loss(x), Optimization.AutoZygote())
p = Float64.(ndae.p) # initial network parameters
optprob = Optimization.OptimizationProblem(optfunc, p)
res = Optimization.solve(optprob, BFGS(initial_stepnorm = 0.0001), callback = callback)

With this code, I ran into the following error:

┌ Warning: Instability detected. Aborting
└ @ SciMLBase /Users/***/.julia/packages/SciMLBase/QzHjf/src/integrator_interface.jl:491
MethodError: no method matching _forward(::Vector{Float64})

The first reference to my own code in the stack trace points to this line:

    ndae(u₀,truedu0,p)

It seems to be an error deep inside the library, and I'm unsure how to find the exact bug since I'm quite new to Julia. When I trace the error into the neural_dae implementation, it comes from this line:

solve(prob,n.args...;sensealg=TrackerAdjoint(),n.kwargs...)

I would appreciate any guidance on where to look to resolve this bug. If everything goes well, I hope to open a pull request adding this as a NeuralDAE example in the documentation!

nghiahhnguyen (Author)

I'm sending a link to the full stacktrace here.

ChrisRackauckas (Member)

I'm sorry I missed this one. Did you try out of place with DFBDF? I think that was the combo that was being played with.
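A minimal sketch of that combination, reusing the names from the MWE above (untested here; the exact NeuralDAE call signature can differ across DiffEqFlux versions), would keep everything the same and swap the solver for DFBDF from OrdinaryDiffEq:

# Hedged sketch: the out-of-place + DFBDF combination suggested above.
# Everything except the solver mirrors the MWE; Flux.Chain is already out-of-place.
dudt2 = Flux.Chain(x -> x.^3, Flux.Dense(6, 50, tanh), Flux.Dense(50, 2))
ndae = NeuralDAE(dudt2, (u, p, t) -> [u[1] + u[2] + u[3] - 1], tspan, M, DFBDF(),
                 differential_vars = [true, true, false])
ndae(u₀, truedu0, Float64.(ndae.p)) # out-of-place forward pass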

ChrisRackauckas (Member)

Theoretically though, the fully implicit form offers no benefits over the mass matrix DAE form, with many downsides. So I'd highly recommend just using the mass matrix DAE form, because then you always get an index-1 DAE with linear initialization.
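For reference, a minimal sketch of that mass-matrix form using NeuralODEMM (DiffEqFlux's mass-matrix counterpart of NeuralODE), reusing u₀, M, and tspan from the MWE above; it mirrors the documented Robertson example, though the layer sizes here are illustrative:

# Sketch of the mass-matrix DAE form via NeuralODEMM (assumes u₀, M, tspan from above).
# Rodas5 with autodiff disabled handles the singular mass matrix.
dudt2 = Flux.Chain(x -> x.^3, Flux.Dense(3, 64, tanh), Flux.Dense(64, 2))
nmm = NeuralODEMM(dudt2, (u, p, t) -> [u[1] + u[2] + u[3] - 1], tspan, M,
                  Rodas5(autodiff = false), saveat = 0.1)
nmm(u₀) # forward pass; parameters live in nmm.p, so training proceeds as with NeuralODE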
