Custom (nonlinear) regressors for ARX #66

Open · jankap opened this issue Mar 18, 2021 · 2 comments
jankap commented Mar 18, 2021

First, a big thank you. This package reads like a 1:1 implementation of the MATLAB System Identification Toolbox, absolutely awesome.

However, I wonder if it's possible to customize the regressors for the ARX models to get some fancy models :) e.g., I've had some good success with polynomial regressor sets with cross terms via nlarx in MATLAB.
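
For concreteness, a polynomial regressor set with cross terms for a single output and input lag could look like this hypothetical sketch:

# Hypothetical polynomial regressors with cross terms, one output and one input lag:
# φ(t) = [y(t-1), u(t-1), y(t-1)^2, y(t-1)u(t-1), u(t-1)^2]
ϕ(y1, u1) = [y1, u1, y1^2, y1 * u1, u1^2]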

If not, what would you need to get it implemented?

Thanks!

baggepinnen (Owner) commented Mar 18, 2021

Thank you for your kind words :)
We do not have any built-in support for nonlinear ARX models. There are a few alternatives available:

  • If the nonlinearity is on the input only, you may expand the input in a nonlinear basis; tools for this are available at https://github.com/baggepinnen/BasisFunctionExpansions.jl#lpv-arx-modeling (see the sketch after this list).
  • If the nonlinearity is more complicated, the best strategy I've found is to use the lower-level features together with the fact that almost all Julia code is differentiable, and simply perform gradient descent on any model with any cost function. I believe the nlarx support in MATLAB came at a time (and in a language) where computing gradients through a model had to be supported explicitly by the software. In Julia, there are things like DiffEqFlux.jl that are much more powerful and general than the nlarx support in MATLAB.
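
As a minimal sketch of the first alternative: when the static input nonlinearity happens to be known (here a hypothetical saturation; y, u, Ts, na, nb are assumed to come from your experiment), you can transform the input directly and fit an ordinary ARX model:

using ControlSystemIdentification

f(u) = clamp(u, -1, 1)   # hypothetical known static input nonlinearity
d = iddata(y, f.(u), Ts) # transformed input, treated as linear data from here on
Gh = arx(d, na, nb)      # standard linear ARX fit from f(u) to y

When the nonlinearity is unknown, the basis-expansion tools linked above play the role of f.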

Below is a simple example making use of BasisFunctionExpansions that expands both inputs and outputs in two separate bases to perform estimation of a Hammerstein-Wiener model, using Optim and automatic differentiation to optimize the parameters.

using Optim, BasisFunctionExpansions, ControlSystemIdentification
using Statistics, Plots # for mean and plot below

dn = iddata(y, ...) # construct an iddata object from your measured signals
na, nb, nl = 10, 6, 6        # number of output lags, input lags, basis functions
bfe  = UniformRBFE(dn.y, nl) # radial basis covering the output range
bfeu = UniformRBFE(dn.u, nl) # radial basis covering the input range

# Linear ARX regressor: y is the target vector, A holds lagged outputs and inputs
y, A = ControlSystemIdentification.getARXregressor(dn.y, dn.u, na, nb)

forward = function (p)
    # The last 2nl parameters define the two basis-function expansions
    bfa  = BasisFunctionApproximation(bfe,  p[(2na+2nb) .+ (1:nl)])
    bfau = BasisFunctionApproximation(bfeu, p[(2na+2nb+nl) .+ (1:nl)])
    ynl = bfa(dn.y)  # output passed through the output nonlinearity
    unl = bfau(dn.u) # input passed through the input nonlinearity
    _, A_nl = ControlSystemIdentification.getARXregressor(ynl, unl, na, nb)
    # Predict with a linear combination of linear and nonlinear regressors
    [A A_nl] * p[1:2na+2nb]
end
costfun = p -> mean(abs2, y .- forward(p))

# Initialize the linear part with the least-squares ARX solution,
# the rest with small random numbers
p0 = [A\y; 0.1randn(na + nb + 2nl)]
costfun(p0) # sanity-check the cost at the initial point
res = Optim.optimize(
    costfun,
    p0,
    BFGS(),
    Optim.Options(
        show_trace = true,
        show_every = 1,
        iterations = 5000,
        time_limit = 50,
        g_tol = 1e-8,
    ),
    autodiff = :forward, # forward-mode automatic differentiation for the gradient
)

yarx = A * (A \ y)           # prediction of the purely linear ARX model
yhl = forward(res.minimizer) # prediction of the Hammerstein-Wiener model
plot(
    plot([y yarx yhl], lab = ["y" "y_{arx}" "y_{hw}"], linewidth = [2 1 1]),
    plot(y - yhl, ylims = (-0.2, 0.2)),
)

baggepinnen (Owner) commented
I just merged some new functionality that allows for estimation of Hammerstein-Wiener models. It's not identical to NARX, but it should hopefully cover at least some situations:

https://baggepinnen.github.io/ControlSystemIdentification.jl/dev/nonlinear/
