Can Talos Work with Unsupervised Learning on LSTM/Autoencoder Model #533

Open
krwiegold opened this issue Jan 20, 2021 · 7 comments

@krwiegold

Hi, I am trying to use Talos to optimize the hyperparameters of an unsupervised LSTM/Autoencoder model. The model works without Talos. Since I do not have y data (no known labels / dependent variables), I created my model as shown below. The input data is called "scaled_data".

# set parameters for Talos
p = {'optimizer': ['Nadam', 'Adam', 'sgd'],
     'losses': ['binary_crossentropy', 'mse'],
     'activation': ['relu', 'elu']}

# create autoencoder model
def create_model(X_input, y_input, params):
    autoencoder = Sequential()
    autoencoder.add(LSTM(12, input_shape=(scaled_data.shape[1], scaled_data.shape[2]),
                         activation=params['activation'], return_sequences=True,
                         kernel_regularizer=tf.keras.regularizers.l2(0.01)))
    autoencoder.add(LSTM(4, activation=params['activation']))
    autoencoder.add(RepeatVector(scaled_data.shape[1]))
    autoencoder.add(LSTM(4, activation=params['activation'], return_sequences=True))
    autoencoder.add(LSTM(12, activation=params['activation'], return_sequences=True))
    autoencoder.add(TimeDistributed(Dense(scaled_data.shape[2])))
    autoencoder.compile(optimizer=params['optimizer'], loss=params['losses'], metrics=['acc'])

    history = autoencoder.fit(X_input, y_input, epochs=10, batch_size=1, validation_split=0.0,
                              callbacks=[EarlyStopping(monitor='acc', patience=3)]).history

    return autoencoder, history

scan_object = talos.Scan(x=scaled_data, y=scaled_data, params=p, model=create_model, experiment_name='LSTM')

My error says: TypeError: create_model() takes 3 positional arguments but 5 were given.

How am I passing 5 arguments? Any ideas on how to fix this? I looked through the documentation and other issues, but I don't see anything about unsupervised models. Thank you!

@mikkokotila
Contributor

def create_model(X_input, y_input, params) is wrong. You must declare it exactly as the docs explain for the input model.

For example:

def iris_model(x_train, y_train, x_val, y_val, params):
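For the autoencoder case in this thread, a minimal sketch of what that could look like (assumptions: the data is passed as both x and y, as in the original Scan call; Talos then supplies x_train, y_train, x_val, y_val from its validation split; and, as in the Talos examples, the history object is returned first, then the model):

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, RepeatVector, TimeDistributed, Dense
from tensorflow.keras.callbacks import EarlyStopping

def create_model(x_train, y_train, x_val, y_val, params):
    # same architecture as above, but built from the arguments Talos passes in
    autoencoder = Sequential()
    autoencoder.add(LSTM(12, input_shape=(x_train.shape[1], x_train.shape[2]),
                         activation=params['activation'], return_sequences=True,
                         kernel_regularizer=tf.keras.regularizers.l2(0.01)))
    autoencoder.add(LSTM(4, activation=params['activation']))
    autoencoder.add(RepeatVector(x_train.shape[1]))
    autoencoder.add(LSTM(4, activation=params['activation'], return_sequences=True))
    autoencoder.add(LSTM(12, activation=params['activation'], return_sequences=True))
    autoencoder.add(TimeDistributed(Dense(x_train.shape[2])))
    autoencoder.compile(optimizer=params['optimizer'], loss=params['losses'], metrics=['acc'])

    # an autoencoder reconstructs its own input, so the targets are the inputs
    out = autoencoder.fit(x_train, x_train, epochs=10, batch_size=1,
                          validation_data=(x_val, x_val),
                          callbacks=[EarlyStopping(monitor='loss', patience=3)])

    # history object first, then the model, as in the Talos examples
    return out, autoencoder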

@mikkokotila mikkokotila self-assigned this Feb 14, 2021
@mikkokotila mikkokotila added the "user support (nothing is wrong with Talos)" label on Feb 14, 2021
@krwiegold
Author

My model is unsupervised, so I do not have the "y" dataset. Does Talos only work for supervised models?

@mikkokotila mikkokotila changed the title from "Error using Talos with Unsupervised Learning on LSTM/Autoencoder Model" to "Can Talos Work with Unsupervised Learning on LSTM/Autoencoder Model" on Jan 29, 2022
@mikkokotila mikkokotila added the "discussion" label and removed the "user support (nothing is wrong with Talos)" label on Jan 29, 2022
@alexcwsmith

I am also wondering this. @krwiegold, did you ever find a way to make this work?

@krwiegold
Author

@alexcwsmith I could never get it to work unfortunately. I had to give up on talos.

@mikkokotila
Contributor

Let's have a look into this. Some of the higher-priority items, like full support for multi-input models and distributed experiments, have now been completed, so I think this could very well be next. It's a very interesting problem, given there is no ground truth to optimize for.

@mikkokotila
Contributor

@krwiegold @alexcwsmith can you help by sharing one or two code-complete examples in Google Colab where such a model runs without Talos? Also, do you have any thoughts on possible ways to implement this support in Talos?

@alexcwsmith

Thanks @mikkokotila
I'm not a Colab user. I tried to get this running in Colab for a bit, but I don't know the basics, so it seems like a steep learning curve just to run a simple script. If that is the only way you can run this, I can have someone who knows Colab get it in there for you.

The simplest example, I think, is the VAE example from PyTorch here:

https://github.com/pytorch/examples/tree/main/vae

As for possible ways to implement Talos with a VAE, simply running a scan to find parameters that minimize the loss, or the KL divergence, would be a great start.
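For reference, the KL divergence of a diagonal-Gaussian encoder against a unit-Gaussian prior has a closed form, which is the quantity such a scan would minimize alongside the reconstruction loss. A minimal sketch in TensorFlow (the z_mean / z_log_var tensors are hypothetical encoder outputs, not part of the thread's code):

import tensorflow as tf

def gaussian_kl(z_mean, z_log_var):
    # KL(q(z|x) || N(0, I)) for a diagonal Gaussian, summed over latent
    # dimensions and averaged over the batch
    return -0.5 * tf.reduce_mean(
        tf.reduce_sum(1.0 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var), axis=-1))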
