
Advice for training with TPU #2158

Answered by rwightman
gau-nernst asked this question in Q&A

Technically it would probably be possible to hack something together to HTTP-stream TFDS from .tfrecord shards in a HF dataset (it's just raw data, after all). But in any case, you wouldn't want to stream from the Hub to Google Cloud for TPU training because it's too slow; you'd be wasting the TPUs. You need to copy your dataset to GCS for training with TPUs if you want any sort of reasonable performance and reliability.
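To illustrate the "just raw data" point above: the .tfrecord container format is simple framing around arbitrary byte payloads, so it can in principle be parsed from any byte stream, including an HTTP response. Below is a minimal sketch of that framing (this is not part of timm or TFDS; the per-record CRC32C checksums are skipped on read and zero-filled on write for brevity, so strict readers like `tf.data.TFRecordDataset` would reject the writer's output):

```python
import io
import struct

def iter_tfrecords(stream):
    """Yield raw record payloads from a TFRecord byte stream.

    TFRecord framing per record: an 8-byte little-endian payload length,
    a 4-byte masked CRC of the length, the payload bytes, then a 4-byte
    masked CRC of the payload. CRC validation is skipped in this sketch.
    """
    while True:
        header = stream.read(8)
        if not header:
            return  # clean end of stream
        (length,) = struct.unpack("<Q", header)
        stream.read(4)              # length CRC (ignored here)
        payload = stream.read(length)
        stream.read(4)              # payload CRC (ignored here)
        yield payload

def write_tfrecord(stream, payload):
    """Write one record with zero-filled CRC placeholders (sketch only)."""
    stream.write(struct.pack("<Q", len(payload)))
    stream.write(b"\x00" * 4)       # length CRC placeholder
    stream.write(payload)
    stream.write(b"\x00" * 4)       # payload CRC placeholder

# Round-trip two payloads through an in-memory stream.
buf = io.BytesIO()
write_tfrecord(buf, b"hello")
write_tfrecord(buf, b"world")
buf.seek(0)
print(list(iter_tfrecords(buf)))    # [b'hello', b'world']
```

In practice you'd point a real reader at `gs://your-bucket/...` shards on GCS rather than parsing frames by hand, which is exactly why copying the data there first is the sensible route.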

I did have timm working quite well with TPUs + PyTorch XLA on an alternate branch with a different API I called bits (https://github.com/huggingface/pytorch-image-models/tree/bits_and_tpu/timm/bits) ... a few people were using it successfully at the time. However, I lost reliable access to TPUs and …

Answer selected by gau-nernst