
[BUG]: PySR runs well once and then stops after error #424

Open
BMP-TUD opened this issue Sep 13, 2023 · 1 comment
Labels: bug (Something isn't working)


BMP-TUD commented Sep 13, 2023

What happened?

Hello,

I was trying to use PySR and ran into a problem: on the first run, the model identified the equation correctly. However, when I ran the same code on other data, nothing happened and execution stopped with the error below.

I am not sure whether I am causing this problem or what the cause could be. I am running the code with Python 3.11.0 and Julia 1.8.5. If there is already an issue that covers this, sorry for posting the same question twice. I hope you can help me resolve this problem.

Best wishes,
Bartosz

Version

0.16.3

Operating System

Windows

Package Manager

pip

Interface

Jupyter Notebook

Relevant log output

UserWarning                               Traceback (most recent call last)
Cell In[45], line 19
      1 from pysr import PySRRegressor
      3 model = PySRRegressor(
      4     niterations=40,  # < Increase me for better results
      5     binary_operators=["+", "*", "-"],
   (...)
     17     progress=False
     18 )
---> 19 model.fit(x_train_ic,x_dot)

File ~\Anaconda3\envs\tristan\Lib\site-packages\pysr\sr.py:1904, in PySRRegressor.fit(self, X, y, Xresampled, weights, variable_names, X_units, y_units)
   1900 seed = random_state.get_state()[1][0]  # For julia random
   1902 self._setup_equation_file()
-> 1904 mutated_params = self._validate_and_set_init_params()
   1906 (
   1907     X,
   1908     y,
   (...)
   1915     X, y, Xresampled, weights, variable_names, X_units, y_units
   1916 )
   1918 if X.shape[0] > 10000 and not self.batching:

File ~\Anaconda3\envs\tristan\Lib\site-packages\pysr\sr.py:1346, in PySRRegressor._validate_and_set_init_params(self)
   1344         parameter_value = 1
   1345     elif parameter == "progress" and not buffer_available:
-> 1346         warnings.warn(
   1347             "Note: it looks like you are running in Jupyter. "
   1348             "The progress bar will be turned off."
   1349         )
   1350         parameter_value = False
   1351 packed_modified_params[parameter] = parameter_value

UserWarning: Note: it looks like you are running in Jupyter. The progress bar will be turned off.

Extra Info

This is the minimal example; x_train_ic is just a time series and x_dot contains its derivatives.

from pysr import PySRRegressor

model = PySRRegressor(
    niterations=40,  # < Increase me for better results
    binary_operators=["+", "*", "-"],
    #unary_operators=[
    #    "cos",
    #    "exp",
    #    "sin",
    #    "inv(x) = 1/x",
        # ^ Custom operator (julia syntax)
    #],
    #extra_sympy_mappings={"inv": lambda x: 1 / x},
    # ^ Define operator for SymPy as well
    loss="loss(prediction, target) = (prediction - target)^2",
    # ^ Custom loss function (julia syntax)
    progress=False
)
model.fit(x_train_ic, x_dot)
BMP-TUD added the bug (Something isn't working) label on Sep 13, 2023
MilesCranmer (Owner) commented:

When you say you ran it a second time, what do you mean? Could you paste the entire example for both runs?

Also, it doesn't look like there's an error here; the output shown is only a UserWarning. Is the code maybe still running, just very slowly? You could check in the task manager whether it is still using the CPU, for example, or whether it actually exited.
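To illustrate the distinction being drawn here: a UserWarning goes through Python's warnings machinery and does not interrupt execution, even though Jupyter renders it with a traceback that can look like a crash. A minimal sketch, where fit_like is a hypothetical stand-in for model.fit that raises the same warning PySR does:

```python
import warnings

def fit_like():
    # Stand-in for model.fit: emits the same kind of warning as PySR,
    # then keeps executing normally.
    warnings.warn(
        "Note: it looks like you are running in Jupyter. "
        "The progress bar will be turned off."
    )
    return "still running"

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # make sure the warning is recorded
    result = fit_like()

print(result)                        # the function returned despite the warning
print(caught[0].category.__name__)   # it was a UserWarning, not an exception
```

So seeing this message in a notebook only means the progress bar was disabled; the fit itself may still be running in the background.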
