
Scalability / Datasets larger than RAM #43

Open
gnewton opened this issue Feb 11, 2019 · 2 comments

Comments


gnewton commented Feb 11, 2019

Hello,

I have not tried this out yet (I am excited to), but I was wondering how wego deals with very large datasets, particularly those larger than RAM. Can wego handle this use case?

Thanks,
Glen

Owner

ynqa commented Feb 12, 2019

@gnewton Thanks for pointing this out; you're quite right. Currently, wego cannot train models on datasets too large to fit in memory. I'm working on this issue now. Please wait for a fix :)

@estebarb

Any guidance on RAM required per word?
