
Installation issues in Colab #525

Open
Vivdaddy opened this issue Sep 15, 2021 · 2 comments

Comments

@Vivdaddy

Issue: Bug
Version: commit a7c0965
Command-line output is in the Colab link below.

Hi,
I wanted to run the notebook https://github.com/google/uncertainty-baselines/blob/b3686f75a10b1990c09b8eb589657090b8837d2c/baselines/notebooks/Hyperparameter_Ensembles.ipynb in Colab to check the performance of the hyperparameter ensemble method.

Colab link: https://colab.research.google.com/drive/1mA6LX6P3p2O31TM9bSkqMwoIxlVRUWqh?usp=sharing

In Colab, I therefore ran the following command so that all of the packages would be installed properly:
!pip install "git+https://github.com/google/uncertainty-baselines.git#egg=uncertainty_baselines"

When I started running the notebook sequentially, cell 5 raised ModuleNotFoundError: No module named 'robustness_metrics'. After a quick search I installed robustness_metrics with pip install "git+https://github.com/google-research/robustness_metrics.git#egg=robustness_metrics" and tried again. I then got another ModuleNotFoundError, this time for edward2, which I installed with pip install edward2. After handling that, I got one more ModuleNotFoundError, for seqio, which I duly installed.
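For anyone chasing the same chain of errors, it may save time to check up front which of these modules can actually be imported before running the notebook. This is just a diagnostic sketch (the module list is simply the set named in the errors above, not something the notebook itself defines):

```python
import importlib.util

# Modules the notebook ended up needing, per the errors above.
needed = ["uncertainty_baselines", "robustness_metrics", "edward2", "seqio"]

# find_spec returns None when a module cannot be located for import.
missing = [name for name in needed if importlib.util.find_spec(name) is None]
print("still missing:", missing)
```

Running this in a fresh Colab cell shows all remaining gaps at once instead of surfacing them one ModuleNotFoundError at a time.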

This time, however, the uncertainty_baselines module fails to import at all, and the notebook fails when I run test_dataset = ub.datasets.get(DATASET, split=tfds.Split.TEST).load(batch_size=BATCH_SIZE). This seems to be due to a dependency conflict between flatbuffers versions. During the imports I get:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. tensorflow 2.6.0 requires flatbuffers~=1.12.0, but you have flatbuffers 2.0 which is incompatible.

and subsequently:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. tf-nightly 2.7.0.dev20210915 requires flatbuffers~=2.0, but you have flatbuffers 1.12 which is incompatible.
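Since the two errors pin flatbuffers in opposite directions (tensorflow 2.6.0 wants ~=1.12.0 while tf-nightly wants ~=2.0), it helps to see which distributions are actually installed before reinstalling anything. A small standard-library sketch for that check:

```python
from importlib import metadata

def installed_version(dist):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return metadata.version(dist)
    except metadata.PackageNotFoundError:
        return None

# Only one of tensorflow / tf-nightly should be present; flatbuffers must
# then satisfy whichever constraint that one declares.
for dist in ("tensorflow", "tf-nightly", "flatbuffers"):
    print(dist, "->", installed_version(dist))
```

If both tensorflow and tf-nightly report a version, no single flatbuffers release can satisfy both, and one of the two TensorFlow distributions has to be uninstalled.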

All I want to do is run the Jupyter notebook listed above and experiment with the hyperparameter ensembles technique. Could someone please direct me on how to get it running?

copybara-service bot pushed a commit to google-research/robustness_metrics that referenced this issue Sep 20, 2021
Nightly can often break, which makes usage of Robustness Metrics itself unstable. Let's remove the explicit dependence and require users to manually install either the stable or nightly version for now.

See also google/uncertainty-baselines#530 for Uncertainty Baselines and the GitHub issues raised about this (google/uncertainty-baselines#407, google/uncertainty-baselines#525).

PiperOrigin-RevId: 397788272
@burakisikli

This issue still exists. Do you know what the situation is?

@dusenberrymw
Member

Hi! This should be fixed now as of e1d8f01. Can you try it again? If you still run into issues, please let us know!
