
Faster loading of MAG240M feats in DRAM #458

Open
UtkrishtP opened this issue Sep 30, 2023 · 0 comments
Hello Team,

I have sufficient DRAM in my system, close to 750 GB, and am looking to load the feats in memory to exploit faster DRAM access. However, the features are stored in .npy format, which makes the loading process extremely slow.

For ogbn_papers100M and the other datasets in that family, we use the compressed .npz format and also store the features in binary form in a preprocessed directory, which makes loading from disk extremely fast.

Is it possible to reuse the same libraries for the MAG240M dataset, or is there a workaround?

TIA.
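For reference, here is a minimal sketch of the kind of one-time workaround I have in mind: convert the .npy feature file to a raw binary once, then load it on subsequent runs with a plain sequential read. File paths and helper names below are placeholders, not part of any existing API.

```python
import numpy as np

def convert_npy_to_binary(npy_path: str, bin_path: str):
    """One-time conversion: dump the .npy payload as raw bytes.

    Memory-mapping the source keeps the conversion's DRAM footprint
    small even for MAG240M-scale feature matrices.
    """
    arr = np.load(npy_path, mmap_mode="r")
    arr.tofile(bin_path)
    # Shape and dtype are not stored in the raw file, so record them.
    return arr.shape, arr.dtype

def load_binary(bin_path: str, shape, dtype) -> np.ndarray:
    """Load the raw binary with a single sequential read into DRAM."""
    return np.fromfile(bin_path, dtype=dtype).reshape(shape)
```

The raw-binary read avoids the .npy header parsing and, more importantly, is a single large sequential I/O pattern, which disks and the OS page cache handle well; the shape/dtype just need to be persisted separately (e.g. in a small metadata file).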
