
OOM Embedding #2

Open
linusic opened this issue Apr 29, 2019 · 1 comment
linusic commented Apr 29, 2019

My vocab size is 10k+, so with
num_encoder_symbols=num_encoder_symbols,
num_decoder_symbols=num_decoder_symbols,
there are far too many parameters and I get an OOM error.
Even with 12 GB of memory, batch_size = 1, and unit = 8, it still OOMs.

DataXujing (Owner) commented

Your batch_size is already 1. If you are training on a CUDA GPU, you can adjust some parameters in lx_bot_3.py around line 195, or make input_seq_len and output_seq_len smaller.
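
For readers hitting the same problem: lx_bot_3.py is not quoted in this thread, so the sketch below only assumes the model is built with TensorFlow 1.x's tf.contrib.legacy_seq2seq.embedding_attention_seq2seq (the num_encoder_symbols / num_decoder_symbols keywords in the comment above come from that API). The concrete values and variable names are illustrative, not the repository's actual settings; the point is which knobs drive memory use: the unrolled sequence lengths, the embedding size, and letting the GPU allocator grow on demand.

```python
import tensorflow as tf  # TensorFlow 1.x

# Illustrative hyperparameters; smaller values reduce memory use.
input_seq_len = 10        # shorter sequences shrink the unrolled graph and attention
output_seq_len = 12
num_encoder_symbols = 10000
num_decoder_symbols = 10000
embedding_size = 128      # smaller embeddings cut the parameter count
hidden_units = 128

# One int32 placeholder per timestep, as the legacy seq2seq API expects.
encoder_inputs = [tf.placeholder(tf.int32, [None], name="enc_%d" % i)
                  for i in range(input_seq_len)]
decoder_inputs = [tf.placeholder(tf.int32, [None], name="dec_%d" % i)
                  for i in range(output_seq_len)]

cell = tf.nn.rnn_cell.BasicLSTMCell(hidden_units)

outputs, state = tf.contrib.legacy_seq2seq.embedding_attention_seq2seq(
    encoder_inputs, decoder_inputs, cell,
    num_encoder_symbols=num_encoder_symbols,
    num_decoder_symbols=num_decoder_symbols,
    embedding_size=embedding_size,
    feed_previous=False)

# Let the GPU allocator grow on demand instead of grabbing all memory up front.
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
sess = tf.Session(config=config)
```

Shorter input_seq_len / output_seq_len shrink the unrolled graph and the attention maps, and a smaller embedding_size reduces both the embedding matrix and the output projection, which together are usually what pushes a 10k-symbol model past 12 GB.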
