# Add backend-agnostic worker-process data loading #19692
---

Interesting; for me, the import does not work under the torch backend. I thought this was the intended behaviour:

```shell
conda create -y -n test-py-tensorflow python=3.11
conda activate test-py-tensorflow
pip install -U tensorflow keras
conda env config vars set KERAS_BACKEND=tensorflow
conda deactivate
conda activate test-py-tensorflow
python -c "from keras.utils import PyDataset"
```

works fine, whereas

```shell
conda create -y -n test-py-torch python=3.11
conda activate test-py-torch
pip install -U keras
conda install pytorch torchvision torchaudio cpuonly -c pytorch
conda env config vars set KERAS_BACKEND=torch
conda deactivate
conda activate test-py-torch
python -c "from keras.utils import PyDataset"
```

yields

```
ImportError: cannot import name 'PyDataset' from 'keras.utils'
```

Edit: Fixed the minimal example
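A stdlib-only way to see why one environment fails the import is to check which Keras version each env actually resolved, since `keras.utils.PyDataset` only exists in Keras 3. This is a diagnostic sketch, not from the thread; the version gate is an assumption based on the discussion here:

```python
from importlib import metadata

def has_pydataset(keras_version: str) -> bool:
    """Heuristic: keras.utils.PyDataset was introduced in Keras 3."""
    return int(keras_version.split(".")[0]) >= 3

try:
    version = metadata.version("keras")
    status = ("PyDataset should be importable" if has_pydataset(version)
              else "Keras 2.x: no PyDataset")
    print(f"keras {version}: {status}")
except metadata.PackageNotFoundError:
    print("keras is not installed in this environment")
```

Running this inside each conda env shows whether the failing env silently resolved an older Keras (for example one pulled in as a TensorFlow dependency) rather than a broken backend selection.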
---

The import pattern should be
---

Sorry, that was a mistake I put in when creating a minimal example. The issue seems to be a little bit deeper. When I install keras using

```python
from keras.utils import PyDataset
>>> ImportError: cannot import name 'PyDataset' from 'keras.utils'
```

When I install with

```python
import keras
>>> ModuleNotFoundError: No module named 'packaging'
```
---

Hi @LarsKue, please install the
---

The library `packaging` is a dependency for Keras. Please check here (line 20 in da83683).
---

@SuryanarayanaY thanks, this fixes the issue.
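Since `packaging` is an ordinary pip-installable dependency, the fix discussed above can be verified with a one-liner. This is a generic check, not quoted from the thread:

```shell
# If 'packaging' is missing, 'import keras' fails with the
# ModuleNotFoundError above before any backend is even loaded;
# installing it directly resolves the error.
python -c "import packaging; print('packaging', packaging.__version__)" \
  || pip install packaging
```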
---

Worker-process data loading is an integral part of many training applications. Exposing the API provided in `keras.utils.PyDataset` to backends other than tensorflow would be a valuable addition.

As an alternative, flags like `workers`, `use_multiprocessing`, and `max_queue_size` could be added to `Model.fit`, but this may be confusing when the user passes data that is fully in memory.