
How can I train KAN on batches of data? #214

Open
lukmanulhakeem97 opened this issue May 19, 2024 · 4 comments

@lukmanulhakeem97

While training a KAN on the entire dataset, say 10000 samples, I end up with a memory overflow issue. Instead of giving the entire data at once in the "dataset" dict, can we train the KAN on small batches of data, like we normally do with an ANN?

@KindXiaoming
Owner

Hi, you can pass the batch argument in train: https://github.com/KindXiaoming/pykan/blob/master/kan/KAN.py#L761
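
For anyone landing here, a minimal sketch of what that looks like, assuming the train() signature at the linked line and the dataset dict produced by pykan's create_dataset (keys like 'train_input'); adjust for your pykan version:

```python
from kan import KAN, create_dataset

model = KAN(width=[2, 5, 1], grid=5, k=3)

# 10000 training samples of a toy 2-variable function
dataset = create_dataset(lambda x: x[:, [0]] ** 2 + x[:, [1]],
                         n_var=2, train_num=10000)

# batch=512: each optimisation step draws a random subset of 512 samples
# instead of using all 10000 at once, which bounds memory per step.
model.train(dataset, opt="LBFGS", steps=50, batch=512)
```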

@lukmanulhakeem97
Author

Thanks @KindXiaoming, got it 👍

@HajiDr

HajiDr commented May 20, 2024

@KindXiaoming can you explain how the batch argument works in the case of KANs? It seems to only train and test on the batch size specified per step. Or did I get it wrong?

@sparcycram

sparcycram commented Jun 5, 2024

@KindXiaoming

Any thoughts appreciated

I am still having OutOfMemoryError problems with large datasets. Is it possible to modify the code to use a DataLoader and pass train_loader and test_loader to train instead of dataset?

So edit this line:

def train(self, dataset, opt="LBFGS", steps=100, ......
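
Until something like that exists, one possible workaround (a sketch only, not pykan API; the dict keys are assumed from create_dataset's convention) is to keep the full tensors on the CPU, stream mini-batches with a DataLoader, and call train() once per chunk:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from kan import KAN

model = KAN(width=[2, 5, 1], grid=5, k=3)

# Large training set kept on CPU; toy 2-variable target for illustration.
train_x = torch.rand(100000, 2) * 2 - 1
train_y = train_x[:, [0]] ** 2 + train_x[:, [1]]
test_x = torch.rand(1000, 2) * 2 - 1
test_y = test_x[:, [0]] ** 2 + test_x[:, [1]]

loader = DataLoader(TensorDataset(train_x, train_y),
                    batch_size=2048, shuffle=True)

for xb, yb in loader:
    chunk = {
        "train_input": xb,
        "train_label": yb,
        "test_input": test_x,
        "test_label": test_y,
    }
    # A few optimiser steps per chunk; the model state carries over
    # between calls, so this approximates mini-batch training.
    model.train(chunk, opt="LBFGS", steps=5)
```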
