Defective pip dependencies: numba, numpy and protobuf #1148

Open
MhhhxX opened this issue Jun 14, 2023 · 3 comments

MhhhxX commented Jun 14, 2023

I installed LIT as described in the "Install from source" section. Every command succeeded, including building the frontend with yarn.

Then I tried to run a demo from the examples module within the conda environment with the following command: python -m lit_nlp.examples.penguin_demo --port=4321 --quickstart.
I received the following error:

Traceback (most recent call last):
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/home/user/lit/lit_nlp/examples/penguin_demo.py", line 16, in <module>
    from lit_nlp import dev_server
  File "/home/user/lit/lit_nlp/dev_server.py", line 21, in <module>
    from lit_nlp import app as lit_app
  File "/home/user/lit/lit_nlp/app.py", line 33, in <module>
    from lit_nlp.components import core
  File "/home/user/lit/lit_nlp/components/core.py", line 35, in <module>
    from lit_nlp.components import shap_explainer
  File "/home/user/lit/lit_nlp/components/shap_explainer.py", line 28, in <module>
    import shap
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/shap/__init__.py", line 12, in <module>
    from ._explanation import Explanation, Cohorts
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/shap/_explanation.py", line 12, in <module>
    from .utils._general import OpChain
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/shap/utils/__init__.py", line 1, in <module>
    from ._clustering import hclust_ordering, partition_tree, partition_tree_shuffle, delta_minimization_order, hclust
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/shap/utils/_clustering.py", line 4, in <module>
    from numba import jit
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/numba/__init__.py", line 43, in <module>
    from numba.np.ufunc import (vectorize, guvectorize, threading_layer,
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/numba/np/ufunc/__init__.py", line 3, in <module>
    from numba.np.ufunc.decorators import Vectorize, GUVectorize, vectorize, guvectorize
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/numba/np/ufunc/decorators.py", line 3, in <module>
    from numba.np.ufunc import _internal
SystemError: initialization of _internal failed without raising an exception

Do you have any ideas on how to fix this?
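
For reference, this SystemError from numba usually means the installed numpy is newer than what the installed numba release supports (numpy 1.24 removed deprecated aliases that older numba versions still relied on). A minimal sketch of a pin that often resolves it, assuming a numba release from the 0.56 series (the exact versions are an assumption, not taken from this environment):

pip install "numpy<1.24" "numba>=0.56.4"  # keep numpy below 1.24 until numba catches up; versions are illustrative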


MhhhxX commented Jun 27, 2023

I was able to fix the problem by downgrading the numpy package, but then I ran into another problem with the protobuf package:

Traceback (most recent call last):
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/home/user/lit/lit_nlp/examples/toxicity_demo.py", line 19, in <module>
    from lit_nlp.examples.datasets import classification
  File "/home/user/lit/lit_nlp/examples/datasets/classification.py", line 8, in <module>
    import tensorflow_datasets as tfds
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/__init__.py", line 43, in <module>
    import tensorflow_datasets.core.logging as _tfds_logging
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/__init__.py", line 22, in <module>
    from tensorflow_datasets.core import community
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/community/__init__.py", line 18, in <module>
    from tensorflow_datasets.core.community.huggingface_wrapper import mock_builtin_to_use_gfile
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/community/huggingface_wrapper.py", line 31, in <module>
    from tensorflow_datasets.core import dataset_builder
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/dataset_builder.py", line 34, in <module>
    from tensorflow_datasets.core import dataset_info
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/dataset_info.py", line 50, in <module>
    from tensorflow_datasets.core import splits as splits_lib
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/splits.py", line 34, in <module>
    from tensorflow_datasets.core import proto as proto_lib
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/proto/__init__.py", line 18, in <module>
    from tensorflow_datasets.core.proto import dataset_info_generated_pb2 as dataset_info_pb2  # pylint: disable=line-too-long
  File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/proto/dataset_info_generated_pb2.py", line 22, in <module>
    from google.protobuf.internal import builder as _builder
ImportError: cannot import name 'builder' from 'google.protobuf.internal' (/home/max/.conda/envs/lit-nlp/lib/python3.9/site-packages/google/protobuf/internal/__init__.py)

Downgrading and upgrading protobuf didn't help, as it breaks other dependencies.
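
For reference, this ImportError usually means the generated *_pb2.py files in tensorflow_datasets expect a protobuf runtime that provides google.protobuf.internal.builder (added in protobuf 3.20), while the installed protobuf is older. A hedged first step is simply to inspect what is installed before pinning anything (no specific versions are implied by this thread):

pip show protobuf tensorflow-datasets  # check which side of protobuf 3.20 the environment is on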


MhhhxX commented Jun 27, 2023

I was also able to solve the second problem by applying the steps suggested in that Stack Overflow post.
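
For readers hitting the same error, the widely cited Stack Overflow workaround goes roughly like this (a sketch of the general idea, assuming other packages force an older protobuf; the paths and the downgrade version are illustrative, not taken from the post):

pip install --upgrade protobuf   # install a release that ships internal/builder.py
cp "$(python -c 'import google.protobuf.internal as i, os; print(os.path.dirname(i.__file__))')/builder.py" /tmp/builder.py
pip install protobuf==3.19.6     # downgrade to whatever the other dependencies require (assumed version)
cp /tmp/builder.py "$(python -c 'import google.protobuf.internal as i, os; print(os.path.dirname(i.__file__))')/builder.py"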

MhhhxX changed the title from "Error when running a demo" to "Defective pip dependencies: numba, numpy and protobuf" on Jun 28, 2023
RyanMullins (Member) commented

@MhhhxX Are you still running into this issue after our (somewhat) recent release? That release did a lot to pin the versions of our Python dependencies to address breakages like this.
