How can I train a KAN on batches of data? #214
Comments
Hi, you can pass the
thanks @KindXiaoming, got it 👍
@KindXiaoming can you explain how the batch argument works in the case of KANs? It seems it only trains and tests on the specified batch size per step. Or did I get it wrong?
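My understanding (a sketch, not pykan's actual source) is that the batch argument just subsamples a random batch of the dataset at each optimization step; something like this hypothetical helper, where names and the `-1` sentinel are assumptions based on how the argument appears to behave:

```python
import torch

# Hypothetical illustration of per-step batch sampling. The function name
# and the convention that batch == -1 means "use the full dataset" are
# assumptions, not pykan's verified implementation.
def sample_batch(train_input, train_label, batch):
    n = train_input.shape[0]
    if batch == -1 or batch >= n:
        idx = torch.arange(n)          # full-batch training
    else:
        idx = torch.randperm(n)[:batch]  # fresh random subset each step
    return train_input[idx], train_label[idx]

x = torch.randn(1000, 2)
y = torch.randn(1000, 1)
xb, yb = sample_batch(x, y, batch=32)
print(xb.shape)  # torch.Size([32, 2])
```

If that reading is right, the full dataset still has to fit in memory; batching only reduces the per-step compute, which would explain why large datasets still OOM.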
Any thoughts appreciated. I am still running into OutOfMemoryError with large datasets. Is it possible to modify the code to use a DataLoader and pass train_loader and test_loader to train instead of dataset? That is, edit this line: def train(self, dataset, opt="LBFGS", steps=100, ......
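For what it's worth, here is a rough sketch of what such a DataLoader-based loop could look like. The KAN model is replaced by a small MLP stand-in (so the snippet is self-contained), and Adam is used instead of pykan's default LBFGS, since LBFGS requires a closure and complicates the sketch; none of this is the library's actual code:

```python
import torch
from torch import nn
from torch.utils.data import TensorDataset, DataLoader

# Stand-in model; a real modification would pass the KAN itself here.
model = nn.Sequential(nn.Linear(2, 16), nn.SiLU(), nn.Linear(16, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.randn(2048, 2)
y = (x[:, :1] * x[:, 1:]).sin()
train_loader = DataLoader(TensorDataset(x, y), batch_size=256, shuffle=True)

for epoch in range(1):
    for xb, yb in train_loader:   # only one batch resides in memory at a time
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        opt.step()

print(f"final batch loss: {loss.item():.4f}")
```

The point of the design is that DataLoader streams batches, so peak memory scales with batch_size rather than the dataset size; adapting pykan's train would mostly mean replacing its internal indexing of the dataset dict with this iteration.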
While training a KAN on the entire set of samples, say 10000, I end up with a memory overflow issue. Instead of giving all the data at once in the "dataset" dict, can we train a KAN on small batches of data as we normally do with an ANN?
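One workaround that needs no library changes would be to split the dataset dict into chunks and call train on each chunk in turn. A minimal sketch, assuming the pykan-style dict keys (train_input/train_label/test_input/test_label) shown elsewhere in this thread:

```python
import torch

def iter_batches(dataset, batch_size):
    # Yield smaller dataset dicts that reuse the same key convention
    # (assumption: keys follow pykan's train_input/train_label layout).
    n = dataset['train_input'].shape[0]
    for i in range(0, n, batch_size):
        yield {
            'train_input': dataset['train_input'][i:i + batch_size],
            'train_label': dataset['train_label'][i:i + batch_size],
            'test_input': dataset['test_input'],
            'test_label': dataset['test_label'],
        }

dataset = {
    'train_input': torch.randn(10000, 2),
    'train_label': torch.randn(10000, 1),
    'test_input': torch.randn(1000, 2),
    'test_label': torch.randn(1000, 1),
}
chunks = list(iter_batches(dataset, 1000))
print(len(chunks))  # prints 10
# Then, hypothetically: for sub in chunks: model.train(sub, steps=5)
```

Whether repeated short train calls behave well with LBFGS (which keeps curvature history) is a separate question; the sketch only shows the data-splitting side.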