
MemoryError #65

Open
dinosaxon opened this issue Jul 23, 2018 · 1 comment

Comments

@dinosaxon

I am facing a MemoryError when trying to train a big model. Is there any way to train using all available GPUs?

I am using a linux machine (gcloud) with 8 GPUs.

Traceback (most recent call last):
  File "main.py", line 445, in
    train_model(parameters, args.dataset)
  File "main.py", line 73, in train_model
    dataset = build_dataset(params)
  File "/mnt/sdc200/nmt-keras/data_engine/prepare_data.py", line 229, in build_dataset
    saveDataset(ds, params['DATASET_STORE_PATH'])
  File "/mnt/sdc200/nmt-keras/src/keras-wrapper/keras_wrapper/dataset.py", line 52, in saveDataset
    pk.dump(dataset, open(store_path, 'wb'), protocol=-1)
MemoryError
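The traceback ends in saveDataset, where pk.dump serializes the entire Dataset object in a single call, so the pickler's working memory grows with the whole object at once. A minimal sketch of one possible workaround, pickling large components separately so no single dump has to cover everything (the dict, part names, and paths below are hypothetical, not the repository's actual Dataset structure):

```python
import os
import pickle
import tempfile

# Hypothetical stand-in for a large dataset object, split into parts.
data = {"vocab": {"hello": 0, "world": 1},
        "samples": list(range(1000))}

store_dir = tempfile.mkdtemp()

# Dump each part to its own file; protocol=-1 in saveDataset is
# equivalent to pickle.HIGHEST_PROTOCOL.
for name, part in data.items():
    with open(os.path.join(store_dir, name + ".pkl"), "wb") as f:
        pickle.dump(part, f, protocol=pickle.HIGHEST_PROTOCOL)

# Reload the parts to verify the round-trip.
restored = {}
for name in data:
    with open(os.path.join(store_dir, name + ".pkl"), "rb") as f:
        restored[name] = pickle.load(f)

assert restored == data
```

This only helps if individual components fit in RAM; if a single array is itself too large, it would need a format that supports incremental writes instead of pickle.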

@lvapeab
Owner

lvapeab commented Jul 27, 2018

It seems the problem is not GPU memory but RAM. How big is the dataset? Can you provide the full output of the training process?
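One way to check whether the dataset is the culprit is to measure how large its pickle would be without actually holding the serialized bytes in memory. A sketch using a hypothetical counting sink (the object being measured here is a placeholder, not the real Dataset):

```python
import pickle


class CountingWriter:
    """File-like sink that discards bytes and only counts them,
    so we can measure pickled size without buffering the output."""

    def __init__(self):
        self.count = 0

    def write(self, b):
        self.count += len(b)
        return len(b)


# Placeholder for the real dataset object.
obj = list(range(100000))

sink = CountingWriter()
pickle.dump(obj, sink, protocol=pickle.HIGHEST_PROTOCOL)
print("pickled size in bytes:", sink.count)
```

Comparing that number against the machine's free RAM (pickling can transiently need considerably more than the final size, because of the pickler's internal memo) gives a quick sanity check before retrying the full training run.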

@lvapeab lvapeab closed this as completed Sep 3, 2018
@lvapeab lvapeab reopened this Sep 4, 2018

2 participants