Hello. I want to use this library to compare against my from-scratch code's results, but when I try to use your washout parameter, I run into this error. I know the reason behind the error but could not find a solution without changing the actual source code. Am I missing something about this repo?
Thank you for this wonderful repo.
mustafakucuk0 changed the title from "How can I use wash_nr_time_step correctly?" to "How can I use wash_nr_time_step correctly in generative task?" on Dec 22, 2021.
In this case, yes, I was using 0.2. I modified your hyperopt tutorial code to generate a Lorenz63 time series. Given 6000 steps, I wanted the model to generate the last 2000 steps and optimized the model over the parameter pool.
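For context, the generative setup described above (train on the first part of the series, then let the model feed its own predictions back as input for the remaining steps) can be sketched as follows. This is a minimal from-scratch ESN in NumPy, not the library's API; all sizes, names, and the toy series are my own illustrative assumptions, and a sine mixture stands in for the Lorenz63 component.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D series standing in for one Lorenz63 component (hypothetical data).
T = 6000
t = np.arange(T)
series = np.sin(0.02 * t) + 0.5 * np.sin(0.013 * t)

n_train = 4000          # teacher-forced steps
n_gen = T - n_train     # 2000 steps generated in closed loop

# Minimal ESN (illustrative sizes, not tuned).
n_res = 200
Win = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9

def step(x, u):
    # One reservoir update driven by scalar input u.
    return np.tanh(Win @ np.array([u]) + W @ x)

# Teacher forcing: drive the reservoir with the true series, collect states.
x = np.zeros(n_res)
states = np.empty((n_train - 1, n_res))
for k in range(n_train - 1):
    x = step(x, series[k])
    states[k] = x
targets = series[1:n_train]                       # next-step targets

# Ridge readout.
ridge = 1e-6
Wout = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                       states.T @ targets)

# Closed loop: feed each prediction back as the next input.
u = series[n_train - 1]
preds = np.empty(n_gen)
for k in range(n_gen):
    x = step(x, u)
    u = float(x @ Wout)
    preds[k] = u
```

The closed-loop part is what "generative task" means here: after training, the true series is never seen again.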
The main aim of this study was to compare my from-scratch code's results with this optimized library's results. But in your code, I first have to give both X_train and y_train to the model, whereas I designed my RC code to take only one training array for the training phase and one test array for the test phase.
I could not understand why we have to give y_train separately. FYI, I am not 100% sure about my code. As you can see in the image, I start collecting states after the number of transient steps.
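A minimal sketch of the state-collection step just described, i.e. running the reservoir over the whole input but keeping only post-transient states. The names (`washout`, `states`, `targets`) and the next-step-prediction setup are my assumptions, not the library's interface; the key point is that whatever slice is dropped from the states must be dropped from the targets too.

```python
import numpy as np

rng = np.random.default_rng(1)

T, n_res, washout = 1000, 50, 200    # illustrative sizes
u = rng.standard_normal(T)           # hypothetical input series (X_train)
y = np.roll(u, -1)                   # next-step targets (y_train)

Win = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.8

# Run the reservoir over all inputs, then keep only post-washout states.
x = np.zeros(n_res)
all_states = np.empty((T, n_res))
for k in range(T):
    x = np.tanh(Win @ u[k:k+1] + W @ x)
    all_states[k] = x

states = all_states[washout:]        # shape (T - washout, n_res)
targets = y[washout:]                # targets trimmed identically
```

In a pure next-step generative setup, y_train is just X_train shifted by one step, which is why a single training array can be enough in a from-scratch implementation; a library that supports arbitrary targets asks for both.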
But in your code, for example, when I try to discard the transient, I run into this error:
It's in the ridge regression part, but I do not know how to deal with it. I hope I could explain my situation.
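A guess at what goes wrong in the ridge regression step, sketched with NumPy rather than the library's internals: if the washout is discarded from the state matrix but the full y_train is still passed in, the matrix product in the normal equations fails with a shape mismatch, and trimming the targets by the same washout fixes it. All names and sizes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
T, washout, n_res = 1000, 200, 50
states = rng.standard_normal((T - washout, n_res))  # states after washout
y_full = rng.standard_normal(T)                     # untrimmed targets

# Mismatch: (n_res, 800) @ (1000,) raises a ValueError.
try:
    np.linalg.solve(states.T @ states + 1e-6 * np.eye(n_res),
                    states.T @ y_full)
    raise AssertionError("expected a shape mismatch")
except ValueError:
    pass

# Fix: drop the same washout steps from the targets before the solve.
y = y_full[washout:]
Wout = np.linalg.solve(states.T @ states + 1e-6 * np.eye(n_res),
                       states.T @ y)
```

If the library only trims one of the two internally, that would explain why the error cannot be avoided without touching the source.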