dataset #20
Comments
I also have this problem with the requirements listed in your
What error is raised exactly?
Here are my logs; it seems the process is blocked.
Here are the packages in my anaconda environment.
My tfds version: 4.4.0+nightly
Thank you very much. I have solved this problem.
Due to the network firewall in mainland China, my suggestion is to use servers outside mainland China to download the datasets, or to use pre-downloaded datasets so that TensorFlow does not try to fetch them from Google automatically. Anyway, thank you very much for the author's selfless help. @ZhangYuanhan-AI
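A minimal sketch of the pre-downloaded route, assuming the datasets are placed under tfds's default data directory (`~/tensorflow_datasets`); `local_dataset_ready` is a hypothetical helper name, not part of tfds:

```python
import os
from pathlib import Path

def local_dataset_ready(name, data_dir=None):
    """Hypothetical helper: True if dataset `name` already exists under
    the tfds data directory, so tfds.load(..., download=False) can use it."""
    # tfds defaults to ~/tensorflow_datasets when data_dir is not given
    root = Path(data_dir or os.path.expanduser("~/tensorflow_datasets"))
    return (root / name).is_dir()

# Example: prefer a local copy before letting tfds touch the network
if local_dataset_ready("cifar100"):
    print("local copy found; pass download=False to tfds.load")
else:
    print("no local copy; download on another machine and copy it here")
```

With the data in place, passing `download=False` to `tfds.load` keeps it from contacting Google's servers at all.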
Thanks for this important information, and enjoy NOAH!
Thanks for the tips from @Maystern. Here is the code for using a proxy in get_vtab1k.py:

```python
import os

os.environ['HTTP_PROXY'] = 'http://your_proxy_ip:your_proxy_port'
os.environ['HTTPS_PROXY'] = 'http://your_proxy_ip:your_proxy_port'
```

And don't forget to remove the symlink to the original repo, 'data/vtab-source/data/vtab', after cloning the code. Then you will be able to download the datasets successfully.
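To double-check that these variables are actually visible to Python's HTTP stack (both urllib and requests honor `HTTP_PROXY`/`HTTPS_PROXY`), a quick sanity check using the same placeholder address:

```python
import os
import urllib.request

# Same placeholder address as above -- replace with your real proxy
os.environ['HTTP_PROXY'] = 'http://your_proxy_ip:your_proxy_port'
os.environ['HTTPS_PROXY'] = 'http://your_proxy_ip:your_proxy_port'

# urllib resolves proxy settings from the environment
proxies = urllib.request.getproxies()
print(proxies.get('http'))
print(proxies.get('https'))
```

If the two printed values match your proxy URL, downloads launched from this process should go through the proxy.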
Thanks! Using a proxy is indeed a good way to download.
I want to share a more convenient way to download the dataset. Thanks to the efforts of RepAdapter, we can use the Google Drive link they provide to download the dataset from their repo.
Hello, I can't download the VTAB dataset according to your configuration. Can you send me a copy of the dataset? My email is 1971733261@qq.com.
And I hope you can provide the versions of the following packages: tensorflow, tensorflow-addons, tensorflow-metadata, tensorflow-datasets, tfds-nightly.
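To gather the requested versions in one go, a small sketch using only the standard library (packages not present simply report as missing):

```python
from importlib import metadata

# The packages whose versions are asked for above
for pkg in ["tensorflow", "tensorflow-addons", "tensorflow-metadata",
            "tensorflow-datasets", "tfds-nightly"]:
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "not installed")
```

Pasting the printed output into the issue makes it easy to compare environments.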