
KeyError on maDLC import #1649

Open · 3 tasks
dprotter opened this issue Jan 5, 2024 · 1 comment
Labels: bug (Something isn't working), fixed in future release (Fix or feature is merged into develop and will be available in future release)


dprotter commented Jan 5, 2024

Bug description

Unable to load maDLC training data into SLEAP due to a KeyError, similar to #676 and #702.

Expected behaviour

Expected the data to import via File -> Import -> DeepLabCut Dataset, selecting either the config .yaml file or a training-data CSV file. The dataset was moved from another computer, which may be relevant.

Actual behaviour

Import fails with a KeyError

Your personal set up

Environment packages absl-py==1.0.0 astunparse==1.6.3 attrs @ file:///home/conda/feedstock_root/build_artifacts/attrs_1640799537051/work backports.zoneinfo==0.2.1 cached-property @ file:///home/conda/feedstock_root/build_artifacts/cached_property_1615209429212/work cachetools==4.2.4 cattrs @ file:///home/conda/feedstock_root/build_artifacts/cattrs_1604136207372/work certifi @ file:///home/conda/feedstock_root/build_artifacts/certifi_1700303426725/work/certifi charset-normalizer==2.0.9 cloudpickle @ file:///home/conda/feedstock_root/build_artifacts/cloudpickle_1674202310934/work cycler @ file:///home/conda/feedstock_root/build_artifacts/cycler_1635519461629/work cytoolz @ file:///home/conda/feedstock_root/build_artifacts/cytoolz_1657553457169/work dask @ file:///home/conda/feedstock_root/build_artifacts/dask-core_1644602974678/work efficientnet==1.0.0 flatbuffers==2.0 fonttools @ file:///home/conda/feedstock_root/build_artifacts/fonttools_1666389892786/work fsspec @ file:///home/conda/feedstock_root/build_artifacts/fsspec_1674184942191/work gast==0.4.0 google-auth==2.3.3 google-auth-oauthlib==0.4.6 google-pasta==0.2.0 grpcio==1.43.0 h5py @ file:///home/conda/feedstock_root/build_artifacts/h5py_1604753641401/work hdmf==3.6.1 idna==3.3 image-classifiers==1.0.0 imagecodecs @ file:///home/conda/feedstock_root/build_artifacts/imagecodecs_1644819473370/work imageio @ file:///home/conda/feedstock_root/build_artifacts/imageio_1702571712725/work imgaug @ file:///home/conda/feedstock_root/build_artifacts/imgaug_1640909786103/work imgstore==0.2.9 importlib-metadata==4.10.0 importlib-resources==5.12.0 joblib @ file:///home/conda/feedstock_root/build_artifacts/joblib_1691577114857/work jsmin @ file:///home/conda/feedstock_root/build_artifacts/jsmin_1642532731678/work jsonpickle==1.2 jsonschema==4.17.3 keras==2.7.0 Keras-Applications==1.0.8 Keras-Preprocessing==1.1.2 kiwisolver @ file:///home/conda/feedstock_root/build_artifacts/kiwisolver_1657953088445/work libclang==12.0.0 locket @ file:///home/conda/feedstock_root/build_artifacts/locket_1650660393415/work Markdown==3.3.6 markdown-it-py @ file:///home/conda/feedstock_root/build_artifacts/markdown-it-py_1677100944732/work matplotlib @ file:///home/conda/feedstock_root/build_artifacts/matplotlib-suite_1661439848456/work mdurl @ file:///home/conda/feedstock_root/build_artifacts/mdurl_1704317613764/work munkres==1.1.4 ndx-pose==0.1.1 networkx @ file:///home/conda/feedstock_root/build_artifacts/networkx_1646092782768/work nixio==1.5.3 numpy==1.19.5 oauthlib==3.1.1 opencv-python-headless==4.2.0.34 opt-einsum==3.3.0 packaging==21.3 pandas==1.3.5 partd @ file:///home/conda/feedstock_root/build_artifacts/partd_1695667515973/work patsy @ file:///home/conda/feedstock_root/build_artifacts/patsy_1703606105319/work Pillow @ file:///home/conda/feedstock_root/build_artifacts/pillow_1660385854171/work pkgutil_resolve_name==1.3.10 protobuf==3.16.0 psutil @ file:///home/conda/feedstock_root/build_artifacts/psutil_1666155398032/work pyasn1==0.4.8 pyasn1-modules==0.2.8 Pygments @ file:///home/conda/feedstock_root/build_artifacts/pygments_1700607939962/work pykalman==0.9.5 pynwb==2.3.3 pyparsing==3.0.6 pyrsistent==0.19.3 PySide2==5.13.2 python-dateutil @ file:///home/conda/feedstock_root/build_artifacts/python-dateutil_1626286286081/work python-rapidjson @ file:///home/conda/feedstock_root/build_artifacts/python-rapidjson_1665999896718/work pytz @ file:///home/conda/feedstock_root/build_artifacts/pytz_1693930252784/work PyWavelets @ 
file:///home/conda/feedstock_root/build_artifacts/pywavelets_1649616401885/work PyYAML @ file:///home/conda/feedstock_root/build_artifacts/pyyaml_1648757092905/work pyzmq @ file:///home/conda/feedstock_root/build_artifacts/pyzmq_1663830492333/work qimage2ndarray==1.10.0 QtPy @ file:///home/conda/feedstock_root/build_artifacts/qtpy_1698112029416/work requests==2.26.0 requests-oauthlib==1.3.0 rich @ file:///home/conda/feedstock_root/build_artifacts/rich-split_1700160075651/work/dist rsa==4.8 ruamel.yaml==0.17.32 ruamel.yaml.clib==0.2.7 scikit-image @ file:///home/conda/feedstock_root/build_artifacts/scikit-image_1645196656256/work scikit-learn @ file:///home/conda/feedstock_root/build_artifacts/scikit-learn_1632611341839/work scikit-video==1.1.11 scipy @ file:///home/conda/feedstock_root/build_artifacts/scipy_1637806658031/work seaborn @ file:///home/conda/feedstock_root/build_artifacts/seaborn-split_1672497695270/work segmentation-models==1.0.1 setuptools-scm==6.3.2 Shapely @ file:///home/conda/feedstock_root/build_artifacts/shapely_1665624546039/work shiboken2==5.13.2 six @ file:///home/conda/feedstock_root/build_artifacts/six_1620240208055/work sleap==1.3.3 statsmodels @ file:///home/conda/feedstock_root/build_artifacts/statsmodels_1654787101575/work tensorboard==2.7.0 tensorboard-data-server==0.6.1 tensorboard-plugin-wit==1.8.0 tensorflow==2.7.0 tensorflow-estimator==2.7.0 tensorflow-hub @ file:///home/conda/feedstock_root/build_artifacts/tensorflow-hub_1678880940235/work/wheel_dir/tensorflow_hub-0.13.0-py2.py3-none-any.whl tensorflow-io-gcs-filesystem==0.23.1 termcolor==1.1.0 threadpoolctl @ file:///home/conda/feedstock_root/build_artifacts/threadpoolctl_1643647933166/work tifffile @ file:///home/conda/feedstock_root/build_artifacts/tifffile_1635944860688/work tomli==2.0.0 toolz @ file:///home/conda/feedstock_root/build_artifacts/toolz_1657485559105/work typing_extensions==4.0.1 tzlocal==5.0.1 unicodedata2 @ file:///home/conda/feedstock_root/build_artifacts/unicodedata2_1649111917568/work urllib3==1.26.7 Werkzeug==2.0.2 wrapt==1.13.3 zipp==3.15.0
Logs
$ sleap-label
Saving config: /home/dprotter/.sleap/1.3.3/preferences.yaml
Restoring GUI state...
2024-01-04 16:13:11.458294: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:939] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2024-01-04 16:13:11.465848: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:939] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2024-01-04 16:13:11.466161: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:939] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero

Software versions:
SLEAP: 1.3.3
TensorFlow: 2.7.0
Numpy: 1.19.5
Python: 3.7.12
OS: Linux-5.15.0-78-generic-x86_64-with-debian-bullseye-sid

Happy SLEAPing! :)
Empty filename passed to function
Traceback (most recent call last):
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/sleap/gui/commands.py", line 309, in importDLC
    self.execute(ImportDeepLabCut)
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/sleap/gui/commands.py", line 242, in execute
    command().execute(context=self, params=kwargs)
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/sleap/gui/commands.py", line 138, in execute
    self.do_with_signal(context, params)
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/sleap/gui/commands.py", line 162, in do_with_signal
    cls.do_action(context, params)
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/sleap/gui/commands.py", line 940, in do_action
    labels = Labels.load_deeplabcut(filename=params["filename"])
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/sleap/io/dataset.py", line 2192, in load_deeplabcut
    return read(filename, for_object="labels", as_format="deeplabcut")
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/sleap/io/format/main.py", line 103, in read
    return disp.read(filename, *args, **kwargs)
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/sleap/io/format/dispatch.py", line 57, in read
    return adaptor.read(file, *args, **kwargs)
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/sleap/io/format/deeplabcut.py", line 78, in read
    file=file, full_video=full_video, *args, **kwargs
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/sleap/io/format/deeplabcut.py", line 186, in read_frames
    data[(animal_name, node, "x")][i],
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/pandas/core/frame.py", line 3457, in __getitem__
    return self._getitem_multilevel(key)
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/pandas/core/frame.py", line 3508, in _getitem_multilevel
    loc = self.columns.get_loc(key)
  File "/home/dprotter/anaconda3/envs/sleap/lib/python3.7/site-packages/pandas/core/indexes/multi.py", line 2932, in get_loc
    return self._engine.get_loc(key)
  File "pandas/_libs/index.pyx", line 725, in pandas._libs.index.BaseMultiIndexCodesEngine.get_loc
  File "pandas/_libs/index.pyx", line 76, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/index.pyx", line 108, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/hashtable_class_helper.pxi", line 1832, in pandas._libs.hashtable.UInt64HashTable.get_item
  File "pandas/_libs/hashtable_class_helper.pxi", line 1841, in pandas._libs.hashtable.UInt64HashTable.get_item
KeyError: 716

Screenshots

How to reproduce

  1. Move the maDLC project folder to the local machine
  2. Open SLEAP with $ sleap-label
  3. Go to File -> Import -> DeepLabCut Dataset
  4. Select either the .yaml file or the training-data .csv file (both give the same KeyError: 716); a headless reproduction sketch follows below
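
For reference, the same code path can be exercised without the GUI, which makes the error easier to reproduce. This is only a sketch: the project path below is hypothetical, but Labels.load_deeplabcut is the same entry point shown in the traceback.

```python
# Minimal headless reproduction sketch (hypothetical project path).
from sleap import Labels

# Same entry point the GUI import uses (see the traceback above),
# so it raises the identical KeyError on an affected maDLC project.
labels = Labels.load_deeplabcut(
    filename="/path/to/maDLC_project/labeled-data/video1/CollectedData_scorer.csv"
)
print(labels)
```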
dprotter added the bug label Jan 5, 2024

dprotter commented Jan 11, 2024

I have made some progress on this front. We are using an identity = True project, where one animal has a couple of identifying marks. Those marks are labeled only on that one animal, so one section of the incoming data structure carries the animal label "single". Its body parts get added as nodes, but they aren't present under any other animal, so looking them up raises the KeyError.
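
To illustrate with a toy example (all animal and body-part names here are hypothetical), the "single" block makes the per-animal column lookup fail in exactly the way the traceback shows:

```python
import numpy as np
import pandas as pd

# Toy maDLC-style table: two tracked animals plus a "single" block holding
# the unique identifying marks.
columns = pd.MultiIndex.from_tuples(
    [
        ("mouse1", "nose", "x"), ("mouse1", "nose", "y"),
        ("mouse2", "nose", "x"), ("mouse2", "nose", "y"),
        ("single", "ear_tag", "x"), ("single", "ear_tag", "y"),
    ],
    names=["individuals", "bodyparts", "coords"],
)
data = pd.DataFrame(np.zeros((1, 6)), columns=columns)

# "ear_tag" gets collected as a node, but it only exists under "single",
# so the per-animal lookup raises a KeyError.
data[("mouse1", "ear_tag", "x")]  # KeyError
```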

The two workarounds I have found are:

  1. Simply deleting those columns
  2. Duplicating those columns, assigning them to the animal to which they correspond in that training data, and then duplicating the header for the other animals in the dataset while leaving the positional data blank (see the sketch after this list)
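
A rough sketch of the second workaround, assuming the labeled-data CSV is read into a DataFrame with (individuals, bodyparts, coords) column levels and the scorer level dropped; file names and header rows are hypothetical and depend on the project layout, and in practice the animal that actually carries the marks would keep its original coordinates:

```python
import numpy as np
import pandas as pd

# Hypothetical path; header rows depend on how the CSV was written.
df = pd.read_csv("CollectedData_scorer.csv", header=[1, 2, 3], index_col=0)

animals = [a for a in df.columns.get_level_values(0).unique() if a != "single"]
single_cols = list(df["single"].columns)  # (bodypart, coord) pairs under "single"

for animal in animals:
    for bodypart, coord in single_cols:
        # Duplicate the identity-mark columns for every animal, leaving positions blank.
        df[(animal, bodypart, coord)] = np.nan

# Drop the original "single" block and save a copy SLEAP can import.
df = df.drop(columns="single", level=0)
df.to_csv("CollectedData_fixed.csv")
```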

That seems to have solved the issue. However, I will note that I was initially using the wrong CSV file, the one located in .../training_datasets/iteration.../.../

I think it would be helpful to add the correct target location ( .../labeled-data/... ) to the documentation, as there are a few CSV files in a DLC project.

shrivaths16 added the fixed in future release label Jan 31, 2024