
Model evaluation problems encountered after several derivations #356

Open
patquem opened this issue Oct 25, 2022 · 6 comments

Comments

@patquem

patquem commented Oct 25, 2022

Hello,
In addition to the issue reported in #355, other problems can be encountered when doing, for instance, model derivations with the symfit framework.
Starting with a simple model1 defined by the expression y(x; a, b) = a*x + b:
After one derivation, the derivative model2 is created with the related expression y(x; a) = a, but you can't evaluate it with the same original parameters a and b, i.e. model2(x, a=1, b=2), because the b parameter is missing from its signature. This is quite embarrassing when working with complicated models and n-th order model derivations (with no way to know in advance which parameters will still be present in the final expression).
Another problem: when evaluating the last expression y(x; a) = a with x as an array, model2(x, a=1) will return a 'singleton' [a] and not [a, a, ..., a] of the same size as x.
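Both behaviours can be reproduced outside symfit with plain Python functions standing in for the two models (hypothetical minimal stand-ins, not symfit code):

```python
import numpy as np

# Original model: y(x; a, b) = a*x + b
def model1(x, a, b):
    return a * x + b

# Derivative with respect to x: y(x; a) = a.
# Note that b (and effectively x) has dropped out of the signature.
def model2(x, a):
    return a

x = np.array([0.0, 1.0, 2.0])
y = model1(x, a=1, b=2)   # array([2., 3., 4.])
yder = model2(x, a=1)     # plain 1, a 'singleton', not array([1., 1., 1.])
# model2(x, a=1, b=2)     # would raise TypeError: unexpected keyword 'b'
```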
Could this be fixed in the future?
Thanks
Patrick

@pckroon
Collaborator

pckroon commented Oct 27, 2022

I think a reasonable solution would be to be able to provide a signature to Model, which should override the automatically generated one. This is something that's going to break for someone at some point, but eh. I can try to mock up a PR.
The broadcasting is also an interesting problem. There is np.broadcast_to, which will solve the issue. The question is whether the desired output shape (per component of the model!) should be a model argument...

In the meantime, a Model has a __signature__ attribute. Set that to the signature of your original Model, and things should work?
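The mechanism behind that suggestion can be sketched in plain Python (TinyModel is a hypothetical stand-in, not symfit's Model): inspect.signature honours a __signature__ attribute, so a callable that binds its arguments through its signature can be made to accept, and silently drop, the original model's extra parameters.

```python
import inspect

class TinyModel:
    """Hypothetical stand-in for a symfit Model whose __call__ binds
    arguments via inspect.signature, so that overriding __signature__
    controls which keyword arguments are accepted."""

    def __init__(self, func, names):
        self.func = func
        self.__signature__ = inspect.Signature(
            [inspect.Parameter(n, inspect.Parameter.POSITIONAL_OR_KEYWORD)
             for n in names])

    def __call__(self, *args, **kwargs):
        # Bind against the (possibly overridden) signature...
        bound = inspect.signature(self).bind(*args, **kwargs)
        # ...then forward only what the wrapped expression needs.
        wanted = inspect.signature(self.func).parameters
        return self.func(**{k: v for k, v in bound.arguments.items()
                            if k in wanted})

# Derivative expression y(x; a) = a, exposed with the original
# signature (x, a, b): the extra b is accepted and silently dropped.
deriv = TinyModel(lambda x, a: a, names=['x', 'a', 'b'])
result = deriv(0.5, a=1, b=2)   # -> 1
```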

@tBuLi What's your take on this?

@patquem
Author

patquem commented Oct 27, 2022

Thanks @pckroon for your answer.
I am not sure the __signature__ attribute is enough to solve the problem (or I don't see how; the same goes for np.broadcast_to).
FYI, here is the workaround I coded to evaluate the model derivatives with the same parameters as the ones used for the model evaluation.
Considering a single expression y(x; ...) for the model:

der = 1  # derivation order: 1, 2, ...
y_der = {}
for _, expr in self.model.function_dict.items():
    y_der[y] = expr.diff(x, der)
if len(y_der[y].free_symbols) == 0:  # --> manage y(x) = Zero and avoid an error when calling CallableModel()
    return np.zeros_like(x_eval)
else:
    model_eval = CallableModel(y_der)

# 'params_eval' creation wrt the parameters still present after derivation
params_eval = {}
for key, val in self.params.items():  # --> self.params contains the values passed for the model evaluation
    if Parameter(key) in model_eval.params:
        params_eval[key] = val

if len(model_eval.independent_vars) > 0:
    res = model_eval(x_eval, **params_eval)[0]
else:
    res = model_eval(**params_eval)[0] * np.ones_like(x_eval)  # --> manage cases where 'x' is no longer in the expression

return res

All this is not really elegant, which is the reason for my issue reports :)

Patrick

@tBuLi
Owner

tBuLi commented Oct 28, 2022

Hi Patrick, thanks for your input. I am not sure that I understand the issue though, perhaps you can let me know if I understood you correctly. Your problem seems to be that you want to derive with respect to the variables instead of the parameters, but keep the same signature? That is indeed not supported currently.

However, to improve on your workaround, it might help to know that symfit uses Model.connectivity_mapping to make its call signature. For your example,

>>> x, y = variables('x, y')
>>> a, b = parameters('a, b')
>>> model = Model({y: a*x + b})
>>> model.connectivity_mapping
{y: {x, a, b}}

So for your derivative, you could do

yder, = variables('yder')
connectivity_mapping = {}
connectivity_mapping[yder] = model.connectivity_mapping[y]
diff = NumericalModel({yder: y.diff(x)}, connectivity_mapping=connectivity_mapping)

I think this should work. Unfortunately, we have to use a NumericalModel because the Model class does not currently allow the user to set the connectivity_mapping, and instead generates it itself. So maybe that could be addressed in a PR.

Never mind, I now realize that this does not sufficiently address your problem. My other solution is to put all derivatives and the original in the same model.

>>> x, y, yder = variables('x, y, yder')
>>> a, b = parameters('a, b')
>>> model = Model({y: a*x + b, yder: a})

In this way, symfit should be able to guess the broadcasting rules correctly. But at the cost of evaluating everything at the same time, which might be undesirable.
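Outside symfit, the broadcasting such a combined model needs can be sketched with np.broadcast_to in plain NumPy (hypothetical function names, not symfit API):

```python
import numpy as np

def eval_combined(x, a, b):
    # y depends on x, so it already has x's shape.
    y = a * x + b
    # yder = a is a scalar expression; broadcast it to match x.
    yder = np.broadcast_to(a, np.shape(x))
    return y, yder

x = np.array([0.0, 1.0, 2.0])
y, yder = eval_combined(x, a=2.0, b=1.0)
# y -> [1., 3., 5.], yder -> [2., 2., 2.]
```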

@pckroon
Collaborator

pckroon commented Oct 28, 2022

My interpretation of the root problem is that there is no way to set a Model's call signature (+ return values/shape), rather than relying on the automated mechanism.
In specific cases (like the one here) a way to tell Model (and fit, and everything else downstream) to ignore additional arguments or reshape output would result in significantly cleaner code.
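What such a mechanism could look like, sketched as a hypothetical helper (not part of symfit): each component keeps only the parameters it actually uses, and its result is broadcast to a requested output shape.

```python
import inspect
import numpy as np

def evaluate_components(funcs, shape, **params):
    # Hypothetical helper: per component, drop the parameters the
    # component does not use, evaluate it, and broadcast the result
    # to the desired output shape.
    out = []
    for f in funcs:
        wanted = inspect.signature(f).parameters
        res = f(**{k: v for k, v in params.items() if k in wanted})
        out.append(np.broadcast_to(res, shape))
    return out

x = np.linspace(0.0, 1.0, 3)
y, yder = evaluate_components(
    [lambda x, a, b: a * x + b,   # original model
     lambda a: a],                # its derivative with respect to x
    x.shape, x=x, a=2.0, b=1.0)
# y -> [1., 2., 3.], yder -> [2., 2., 2.]
```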

@patquem
Author

patquem commented Oct 28, 2022

@tBuLi, @pckroon
Thank you all for your replies.
I have tested @tBuLi's proposals. The first one with NumericalModel (CallableNumericalModel instead?) returns the following error message:

>   File "C:\dev\sw\envs3\env_2208\lib\site-packages\symfit\core\models.py", line 329, in _init_from_dict
>     self.dependent_vars = sorted(ordered.pop(-1), key=sort_func)
> IndexError: pop from empty list

Concerning the 2nd proposal, model = Model({y: a*x + b, yder: a}): I originally coded something similar (inspired by what is done in symfit for the Jacobian and Hessian model calculation). But as mentioned by @tBuLi, I gave up this solution, which implies evaluating things that are not desirable.
Regarding @pckroon's answer, I understand my request would require significant work in symfit and that there is no simple and fast fix to hope for. Too bad.
Thanks again guys.
Do what you can do.
Patrick

@pckroon
Collaborator

pckroon commented Oct 31, 2022

> Regarding @pckroon's answer, I understand my request would require significant work in symfit and that there is no simple and fast fix to hope for. Too bad.

That doesn't mean I don't like the idea, or that it won't get implemented ;) I'm just not making any claims regarding timelines.
