Out of memory when learning deep forest #7
Thanks for your implementation!
When I try to train on a full dataset such as MNIST, the example code crashes with an out-of-memory error.
I commented out the lines that limit the dataset size in the code.
Are these changes wrong?
Or does my computer simply have insufficient memory?
My testing computer has the following specs:
OS: Windows 7
CPU: Core i7 970
Memory: 32GB (physical memory only, no swap)
If this problem is caused by insufficient memory, I would like to know how to economize memory
(e.g. something like mini-batch training in deep neural networks).
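For reference, two simple memory-economizing measures that are often applicable when a model holds the whole training set in RAM are (1) training on a random subset and (2) storing pixels as `float32` instead of `float64`, which halves the array's footprint. The sketch below is only an illustration of these two ideas, not gcForest's actual API; the `subsample` helper and the zero-filled stand-in arrays are hypothetical.

```python
import numpy as np

def subsample(X, y, n_samples, seed=0):
    """Hypothetical helper: randomly pick a subset of the training data.

    Training on a subset is the simplest way to bound memory use when a
    model keeps the entire dataset in RAM at once."""
    rng = np.random.RandomState(seed)
    idx = rng.choice(len(X), size=n_samples, replace=False)
    return X[idx], y[idx]

# Stand-in for MNIST-shaped data: 7,000 images of 28x28 pixels
# (zeros here, just to demonstrate the memory arithmetic).
X = np.zeros((7000, 28 * 28), dtype=np.float64)
y = np.zeros(7000, dtype=np.int64)

# Measure 1: halve memory by storing pixels as float32 instead of float64.
X32 = X.astype(np.float32)

# Measure 2: train on a random subset instead of the full dataset.
X_small, y_small = subsample(X32, y, n_samples=1000)

print(X.nbytes == 2 * X32.nbytes)  # float32 uses exactly half the bytes
print(X_small.shape)               # (1000, 784)
```

Whether the gcForest example code exposes hooks for either measure depends on the implementation; the same arithmetic applies regardless of the library.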