HuggingFace Interface works on Colab but not on local jupyter setup #55
Comments
Hi, I am assuming that you downloaded the Colab notebook as-is and tried to run it locally in JupyterLab, and it didn't work. Please correct me if I am wrong. Regardless,
what would help even more would be if you shared a snapshot of your local notebook with the outputs of each cell up until the point where things failed. I did a quick Google search on your error and the following links seem to have an extensive discussion of it.
Overall it's an environment setup issue and should be solved relatively easily. Answers to these will help us isolate the issue.
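To make gathering that information easier, here is a small diagnostic snippet you could paste into a notebook cell. It is a sketch of my own (the function name `env_report` is not from this repo); it reports the interpreter in use and the exact `site-packages` path each package would be imported from, which is the quickest way to spot a Colab-vs-local environment mismatch:

```python
# Minimal environment report for isolating install issues (illustrative sketch;
# adjust the package list as needed, e.g. to include torch and transformers).
import importlib.util
import sys

def env_report(packages=("torch", "transformers")):
    """Return a dict describing the interpreter and where packages resolve from."""
    info = {
        "python": sys.version.split()[0],
        "executable": sys.executable,
    }
    for name in packages:
        spec = importlib.util.find_spec(name)
        # spec.origin shows which site-packages directory the import would
        # come from, without actually importing (and possibly crashing on)
        # a broken package.
        info[name] = spec.origin if spec else "not installed"
    return info

print(env_report())
```

Pasting the output of that cell here would tell us immediately whether the failing imports come from a system install, a virtual environment, or the user site-packages.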
Thanks for sharing the links. I had already checked those; based on them I reinstalled, then did a clean install, and then deleted the cleanly installed virtual environment.
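For reference, a clean-room reinstall that sidesteps the `~/.local` user site-packages (which is where the traceback below shows `torch` and `transformers` loading from) might look like the sketch below. This is my own suggestion, not the project's documented setup, and the environment name `it2-env` is illustrative:

```shell
# Sketch, assuming python3 with the venv module is available.
# The traceback resolves torch from ~/.local/lib/python3.11/site-packages;
# a dedicated venv keeps one consistent copy of torch/transformers.
python3 -m venv "$HOME/it2-env"

# Inside the venv, reinstall the stack and register a Jupyter kernel
# so JupyterLab runs the notebook against this interpreter:
#   source "$HOME/it2-env/bin/activate"
#   pip install torch transformers ipykernel
#   python -m ipykernel install --user --name it2-env
```

After selecting the `it2-env` kernel in JupyterLab, the imports resolve from the venv rather than from whatever half-deleted install is left in `~/.local`.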
Hi - I have Ubuntu 23.10 Mantic with a couple of GPUs, a 3090 being the biggest one. I want to run a batch of Marathi-English translation with IndicTrans2. When I try to run the `huggingface_interface` notebook on my local machine with JupyterLab, it errors out with the following error:
```
ModuleNotFoundError                       Traceback (most recent call last)
Cell In[1], line 2
      1 import torch
----> 2 from transformers import AutoModelForSeq2SeqLM, BitsAndBytesConfig
      3 from IndicTransTokenizer import IndicProcessor, IndicTransTokenizer
      5 BATCH_SIZE = 4

File ~/.local/lib/python3.11/site-packages/transformers/__init__.py:26
     23 from typing import TYPE_CHECKING
     25 # Check the dependencies satisfy the minimal versions required.
---> 26 from . import dependency_versions_check
     27 from .utils import (
     28     OptionalDependencyNotAvailable,
     29     _LazyModule,
   (...)
     47     logging,
     48 )
     51 logger = logging.get_logger(__name__)  # pylint: disable=invalid-name

File ~/.local/lib/python3.11/site-packages/transformers/dependency_versions_check.py:16
      1 # Copyright 2020 The HuggingFace Team. All rights reserved.
      2 #
      3 # Licensed under the Apache License, Version 2.0 (the "License");
   (...)
     12 # See the License for the specific language governing permissions and
     13 # limitations under the License.
     15 from .dependency_versions_table import deps
---> 16 from .utils.versions import require_version, require_version_core
     19 # define which module versions we always want to check at run time
     20 # (usually the ones defined in `install_requires` in setup.py)
     21 #
     22 # order specific notes:
     23 # - tqdm must be checked before tokenizers
     25 pkgs_to_check_at_runtime = [
     26     "python",
     27     "tqdm",
   (...)
     37     "pyyaml",
     38 ]

File ~/.local/lib/python3.11/site-packages/transformers/utils/__init__.py:33
     24 from .constants import IMAGENET_DEFAULT_MEAN, IMAGENET_DEFAULT_STD, IMAGENET_STANDARD_MEAN, IMAGENET_STANDARD_STD
     25 from .doc import (
     26     add_code_sample_docstrings,
     27     add_end_docstrings,
   (...)
     31     replace_return_docstrings,
     32 )
---> 33 from .generic import (
     34     ContextManagers,
     35     ExplicitEnum,
     36     ModelOutput,
     37     PaddingStrategy,
     38     TensorType,
     39     add_model_info_to_auto_map,
     40     cached_property,
     41     can_return_loss,
     42     expand_dims,
     43     find_labels,
     44     flatten_dict,
     45     infer_framework,
     46     is_jax_tensor,
     47     is_numpy_array,
     48     is_tensor,
     49     is_tf_symbolic_tensor,
     50     is_tf_tensor,
     51     is_torch_device,
     52     is_torch_dtype,
     53     is_torch_tensor,
     54     reshape,
     55     squeeze,
     56     strtobool,
     57     tensor_size,
     58     to_numpy,
     59     to_py_obj,
     60     transpose,
     61     working_or_temp_dir,
     62 )
     63 from .hub import (
     64     CLOUDFRONT_DISTRIB_PREFIX,
     65     HF_MODULES_CACHE,
   (...)
     91     try_to_load_from_cache,
     92 )
     93 from .import_utils import (
     94     ACCELERATE_MIN_VERSION,
     95     ENV_VARS_TRUE_AND_AUTO_VALUES,
   (...)
    200     torch_only_method,
    201 )

File ~/.local/lib/python3.11/site-packages/transformers/utils/generic.py:442
    438         return tuple(self[k] for k in self.keys())
    441 if is_torch_available():
--> 442     import torch.utils._pytree as _torch_pytree
    444     def _model_output_flatten(output: ModelOutput) -> Tuple[List[Any], "_torch_pytree.Context"]:
    445         return list(output.values()), list(output.keys())

File ~/.local/lib/python3.11/site-packages/torch/utils/__init__.py:4
      1 import os.path as _osp
      2 import torch
----> 4 from .throughput_benchmark import ThroughputBenchmark
      5 from .cpp_backtrace import get_cpp_backtrace
      6 from .backend_registration import rename_privateuse1_backend, generate_methods_for_privateuse1_backend

File ~/.local/lib/python3.11/site-packages/torch/utils/throughput_benchmark.py:2
----> 2 import torch._C
      5 def format_time(time_us=None, time_ms=None, time_s=None):
      6     """Define time formatting."""

ModuleNotFoundError: No module named 'torch._C'
```
This same code works on Google Colab. I have around 7K articles to translate; a Google Colab session does not stay up that long, and I would like to avoid paying for this.
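One thing worth checking, given that every path in the traceback starts with `~/.local/...`: packages installed with `pip install --user` land in the user site-packages and shadow other installs, and a `torch` there whose compiled `torch._C` extension was built for a different interpreter fails exactly like this. A hedged sketch (the function name is mine, not from any library) to test whether an import would resolve from the user site:

```python
# Sketch: detect whether a package would resolve from the user site-packages
# (~/.local/lib/...), as the traceback above indicates for torch. A stale
# --user install there often breaks the compiled torch._C extension.
import importlib.util
import site

def resolves_from_user_site(name):
    """True if importing `name` would load it from the user site-packages."""
    spec = importlib.util.find_spec(name)
    if spec is None or spec.origin is None:
        return False
    return spec.origin.startswith(site.getusersitepackages())

# Stdlib modules never live in the user site, so this should print False;
# try "torch" on the failing machine.
print(resolves_from_user_site("json"))
```

If `torch` does resolve from `~/.local`, removing it there (`pip uninstall torch` with the same interpreter, or launching Jupyter with `PYTHONNOUSERSITE=1`) and reinstalling inside the active environment is a reasonable next step.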