
Bigquery: import error with v1.24.0 #99

Closed
sagydr opened this issue May 7, 2020 · 31 comments
Assignees
Labels
api: bigquery Issues related to the googleapis/python-bigquery API. needs more info This issue needs more information from the customer to proceed. type: question Request for information or clarification. Not an issue.

Comments

@sagydr

sagydr commented May 7, 2020

Bug googleapis/google-cloud-python#9965 is still happening in v1.24.0 with six v1.14.0:

File "/root/.local/share/virtualenvs/code-788z9T0p/lib/python3.6/site-packages/google/cloud/bigquery/schema.py", line 17, in <module>
    from six.moves import collections_abc
ImportError: cannot import name 'collections_abc'

Why was issue googleapis/google-cloud-python#9965 closed if it still reproduces for many people?


@plamut plamut transferred this issue from googleapis/google-cloud-python May 8, 2020
@product-auto-label product-auto-label bot added the api: bigquery Issues related to the googleapis/python-bigquery API. label May 8, 2020
@plamut plamut added the type: question Request for information or clarification. Not an issue. label May 8, 2020
@plamut
Contributor

plamut commented May 8, 2020

@sagydr The issue was closed when the six dependency was pinned to >=1.13.0. I also tried installing bigquery into a fresh Python 3.6 environment and it worked fine.

$ python3.6 -m venv venv-3.6
$ . venv-3.6/bin/activate
$ pip install google-cloud-bigquery
$ python
Python 3.6.9 (default, Apr 18 2020, 01:56:04) 
[GCC 8.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from google.cloud import bigquery  # no error
>>> exit()
$ pip freeze
cachetools==4.1.0
certifi==2020.4.5.1
chardet==3.0.4
google-api-core==1.17.0
google-auth==1.14.2
google-cloud-bigquery==1.24.0
google-cloud-core==1.3.0
google-resumable-media==0.5.0
googleapis-common-protos==1.51.0
idna==2.9
pkg-resources==0.0.0
protobuf==3.11.3
pyasn1==0.4.8
pyasn1-modules==0.2.8
pytz==2020.1
requests==2.23.0
rsa==4.0
six==1.14.0
urllib3==1.25.9

Since the installed package versions (six, specifically) are correct, it appears that something in your environment is not what it seems to be.

Could you please double-check that six 1.13.0+ is indeed installed? What is the output of the following?

$ pip freeze

Additionally, if you add the following to your application code (before importing bigquery), what does it print out?

import six
print(six.__version__)

It might be that the environment used to actually run the application is different from the one the pip freeze check was run in, which would explain the difference in behavior.
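One way to check both things at once from inside the application itself (a sketch using only the standard library, so it works even when six is missing):

```python
import importlib.util
import sys

# The interpreter actually running this code; if it differs from the one
# `pip` installs into, upgrades appear not to take effect.
print("interpreter:", sys.executable)

# Locate six without importing it, so a missing install doesn't raise.
spec = importlib.util.find_spec("six")
if spec is None:
    print("six is not importable from this interpreter")
else:
    print("six found at:", spec.origin)
```

If the printed interpreter path is not inside the virtualenv you think you're using, that's the mismatch.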

@plamut
Contributor

plamut commented May 13, 2020

@sagydr Any luck with solving the issue? Is there anything else we can help you with?

@ZIYUNCHEN

I have tried bigquery 1.23 and 1.24 with six 1.14. Both still show the same error.

google-cloud-bigquery==1.23.0
six==1.14.0

google-api-core==1.17.0
google-auth==1.14.3
google-cloud-core==1.3.0

@plamut
Contributor

plamut commented May 14, 2020

@ZIYUNCHEN Just to be sure, what does print(six.__version__) say if you put it right before the line where bigquery is imported? Is the version actually used by the code indeed 1.13.0+?

I tried quite a few combinations multiple times, but did not manage to reproduce the reported import error with an updated version of six.

@gosselinmarcantoine

gosselinmarcantoine commented May 14, 2020

I am also facing this issue. I did a pip install --upgrade six==1.13.0 but the print(six.__version__) still shows 1.12.0. What would be the way to fix this?

@plamut
Contributor

plamut commented May 15, 2020

@gosselinmarcantoine The most likely reason is that six was upgraded in a different Python environment than the one actually used to run the application.

You can check which pip was used to upgrade six from the command line:

$ which pip

and printing out the six module path used in the code:

print(six.__file__)

This could give a clue as to whether different Python environments are indeed being used.
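Putting the two checks together, a quick comparison looks like this (a sketch, assuming a POSIX shell and that `python` is the interpreter that runs the application):

```shell
# Where does `pip` live, and which interpreter does it install into?
which pip
pip --version

# Which interpreter, and which copy of six, does the application see?
python -c 'import sys; print(sys.executable)'
python -c 'import six; print(six.__file__, six.__version__)'
```

If the path printed by `pip --version` and the path printed by `sys.executable` belong to different environments, that explains the stale six.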

@meredithslota meredithslota added the needs more info This issue needs more information from the customer to proceed. label May 20, 2020
@juanroesel

juanroesel commented May 22, 2020

I am also facing this issue. I did a pip install --upgrade six==1.13.0 but the print(six.__version__) still shows 1.12.0. What would be the way to fix this?

I was having the same issue, and when I followed @plamut's instructions I realized that the 'six' package being loaded into my Python environment was version 1.11, while the one installed in my virtual environment was version 1.14.

I'm using PyCharm, and was able to manually upgrade the 'six' package to the latest version (1.15) by going to Preferences, selecting the 'six' package in the Project Interpreter window, and running an update. Don't forget to select the version you would like to get.

After this update, BigQuery (v1.24.0) runs smoothly in my Python environment.

@anistark
Contributor

anistark commented Oct 6, 2020

Still facing this issue. My jenkins deployment pipeline broke with the same message.

Packages on my server:

google-cloud-bigquery==2.0.0
six==1.15.0

Also, checked:

>>> print(six.__file__)
/<local_path>/venv/lib/python3.8/site-packages/six.py
>>> print(six.__version__)
1.15.0

Please advise.

@HemangChothani
Contributor

@anistark Could you please share the stack trace, or is it the same as mentioned above?

@anistark
Contributor

anistark commented Oct 6, 2020

@HemangChothani It's the same as mentioned:

"    from google.cloud import bigquery", 
"  File \"/home/deploy/src/lib/python3.6/site-packages/google/cloud/bigquery/__init__.py\", line 35, in <module>", 
"    from google.cloud.bigquery.client import Client", 
"  File \"/home/deploy/src/lib/python3.6/site-packages/google/cloud/bigquery/client.py\", line 57, in <module>", 
"    from google.cloud.bigquery import _pandas_helpers", 
"  File \"/home/deploy/src/lib/python3.6/site-packages/google/cloud/bigquery/_pandas_helpers.py\", line 36, in <module>", 
"    from google.cloud.bigquery import schema", 
"  File \"/home/deploy/src/lib/python3.6/site-packages/google/cloud/bigquery/schema.py\", line 17, in <module>", 
"    from six.moves import collections_abc", 
"ImportError: cannot import name 'collections_abc'", 

@HemangChothani
Contributor

@anistark I think you need to verify your six library path and google-cloud-bigquery path again, because google-cloud-bigquery is installed in python3.6/site-packages whereas six is installed in venv/python3.8/site-packages.

@anistark
Contributor

anistark commented Oct 6, 2020

@HemangChothani

The 3.8 one was from my local machine and the 3.6 one was from my server.

@HemangChothani
Contributor

@anistark Could you please try with a fresh environment locally? I am not able to reproduce this error.

$ python3.6 -m venv venv-3.6
$ . venv-3.6/bin/activate
$ pip install google-cloud-bigquery  (or pip3 install google-cloud-bigquery)

If the error still occurs, please share the stack trace, the six.__file__ path, and the list of dependencies, which you can get with the pip freeze command.

@anistark
Contributor

anistark commented Oct 6, 2020

Now I'm getting a new error. I uninstalled and reinstalled google-cloud-bigquery with six 1.15.0:

File "/.../venv/lib/python3.8/site-packages/google/protobuf/internal/python_message.py", line 570, in _GetFieldByName
    return message_descriptor.fields_by_name[field_name]
KeyError: 'proto3_optional'

Any ideas?

    from google.cloud import bigquery
  File "/.../venv/lib/python3.8/site-packages/google/cloud/bigquery/__init__.py", line 35, in <module>
    from google.cloud.bigquery.client import Client
  File "/.../venv/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 57, in <module>
    from google.cloud.bigquery import _pandas_helpers
  File "/.../venv/lib/python3.8/site-packages/google/cloud/bigquery/_pandas_helpers.py", line 36, in <module>
    from google.cloud.bigquery import schema
  File "/.../venv/lib/python3.8/site-packages/google/cloud/bigquery/schema.py", line 19, in <module>
    from google.cloud.bigquery_v2 import types
  File "/.../venv/lib/python3.8/site-packages/google/cloud/bigquery_v2/__init__.py", line 19, in <module>
    from .types.encryption_config import EncryptionConfiguration
  File "/.../venv/lib/python3.8/site-packages/google/cloud/bigquery_v2/types/__init__.py", line 18, in <module>
    from .encryption_config import EncryptionConfiguration
  File "/.../venv/lib/python3.8/site-packages/google/cloud/bigquery_v2/types/encryption_config.py", line 29, in <module>
    class EncryptionConfiguration(proto.Message):
  File "/.../venv/lib/python3.8/site-packages/proto/message.py", line 215, in __new__
    field=[i.descriptor for i in fields],
  File "/.../venv/lib/python3.8/site-packages/proto/message.py", line 215, in <listcomp>
    field=[i.descriptor for i in fields],
  File "/.../venv/lib/python3.8/site-packages/proto/fields.py", line 104, in descriptor
    self._descriptor = descriptor_pb2.FieldDescriptorProto(
  File "/.../venv/lib/python3.8/site-packages/google/protobuf/internal/python_message.py", line 509, in init
    field = _GetFieldByName(message_descriptor, field_name)
  File "/.../venv/lib/python3.8/site-packages/google/protobuf/internal/python_message.py", line 572, in _GetFieldByName
    raise ValueError('Protocol message %s has no "%s" field.' %
ValueError: Protocol message FieldDescriptorProto has no "proto3_optional" field.
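The traceback ends in the protobuf runtime rejecting proto3_optional, a field that was added to FieldDescriptorProto in protobuf 3.12. A quick probe for it (a sketch; degrades gracefully when protobuf isn't installed):

```python
# Check whether the installed protobuf runtime (if any) knows about the
# proto3_optional field; older runtimes raise the error shown above.
try:
    from google.protobuf import descriptor_pb2
except ImportError:
    print("protobuf is not installed in this interpreter")
else:
    fields = descriptor_pb2.FieldDescriptorProto.DESCRIPTOR.fields_by_name
    print("proto3_optional supported:", "proto3_optional" in fields)
```

If it prints False, the installed protobuf predates 3.12 and needs upgrading.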

@HemangChothani
Contributor

@anistark Please share your pip freeze list.

@anistark
Contributor

anistark commented Oct 6, 2020

Here's the partial pip freeze output:

gcloud==0.17.0
geographiclib==1.50
geopy==1.21.0
google-api-core==1.22.2
google-api-python-client==1.12.3
google-auth==1.22.0
google-auth-httplib2==0.0.4
google-cloud-bigquery==2.0.0
google-cloud-core==1.4.1
google-cloud-firestore==1.9.0
google-cloud-logging==1.15.1
google-cloud-storage==1.31.2
google-cloud-translate==3.0.1
google-crc32c==1.0.0
google-resumable-media==1.0.0
googleapis-common-protos==1.52.0
googlemaps==4.4.2
graphviz==0.12
grpcio==1.32.0

proto-plus==1.10.0
protobuf==3.8.0

six==1.15.0

@HemangChothani
Contributor

Update protobuf to protobuf==3.11.3 and try again.

@HemangChothani
Contributor

@anistark Is your issue resolved?

@anistark
Contributor

anistark commented Oct 6, 2020

Not really, @HemangChothani. The error still occurs even with protobuf==3.11.3 locally.

@tswast
Contributor

tswast commented Oct 6, 2020

The generator is using protobuf 3.12.0, so that is the minimum version needed. See: googleapis/python-api-core#48

@anistark
Contributor

anistark commented Oct 6, 2020

Finally! Thanks @tswast for that info, and thanks @HemangChothani for staying with me through that. Will try deploying on the server now. 🤞🏼

[Update]: This works. It needs to go into the documentation, or a dependency on protobuf>=3.12.0 should be added.
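Until such a dependency floor is declared by the library itself, projects can pin it in their own requirements file (a sketch; the `requirements.txt` name is the usual convention, and the floor version comes from the comment above):

```text
protobuf>=3.12.0
```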

@anistark
Contributor

anistark commented Oct 7, 2020

Creating an issue for this for future references: #305

@formigone

I'm still getting this error even after installing protobuf==3.12.0

Traceback (most recent call last):
  File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/models/dagbag.py", line 204, in process_file
    m = imp.load_source(mod_name, filepath)
  File "/opt/bitnami/airflow/venv/lib/python3.6/imp.py", line 172, in load_source
    module = _load(spec)
  File "<frozen importlib._bootstrap>", line 684, in _load
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/opt/bitnami/airflow/dags/my-dag.py", line 16, in <module>
    from google.cloud import bigquery
  File "/opt/bitnami/python/lib/python3.6/site-packages/google/cloud/bigquery/__init__.py", line 35, in <module>
    from google.cloud.bigquery.client import Client
  File "/opt/bitnami/python/lib/python3.6/site-packages/google/cloud/bigquery/client.py", line 57, in <module>
    from google.cloud.bigquery import _pandas_helpers
  File "/opt/bitnami/python/lib/python3.6/site-packages/google/cloud/bigquery/_pandas_helpers.py", line 36, in <module>
    from google.cloud.bigquery import schema
  File "/opt/bitnami/python/lib/python3.6/site-packages/google/cloud/bigquery/schema.py", line 19, in <module>
    from google.cloud.bigquery_v2 import types
  File "/opt/bitnami/python/lib/python3.6/site-packages/google/cloud/bigquery_v2/__init__.py", line 19, in <module>
    from .types.encryption_config import EncryptionConfiguration
  File "/opt/bitnami/python/lib/python3.6/site-packages/google/cloud/bigquery_v2/types/__init__.py", line 18, in <module>
    from .encryption_config import EncryptionConfiguration
  File "/opt/bitnami/python/lib/python3.6/site-packages/google/cloud/bigquery_v2/types/encryption_config.py", line 29, in <module>
    class EncryptionConfiguration(proto.Message):
  File "/opt/bitnami/python/lib/python3.6/site-packages/proto/message.py", line 215, in __new__
    field=[i.descriptor for i in fields],
  File "/opt/bitnami/python/lib/python3.6/site-packages/proto/message.py", line 215, in <listcomp>
    field=[i.descriptor for i in fields],
  File "/opt/bitnami/python/lib/python3.6/site-packages/proto/fields.py", line 111, in descriptor
    proto3_optional=self.optional,
ValueError: Protocol message FieldDescriptorProto has no "proto3_optional" field.

Pip freeze:

pip freeze       
appdirs==1.4.4
attrs==20.2.0
Authlib==0.15
authorizenet==1.1.3
boto3==1.15.16
botocore==1.18.16
cached-property==1.5.2
cachetools==4.1.1
certifi==2020.6.20
cffi==1.14.3
chardet==3.0.4
cryptography==3.1.1
dataclasses==0.7
defusedxml==0.6.0
docopt==0.4.0
docutils==0.15.2
ecdsa==0.14.1
enum-compat==0.0.3
future==0.18.2
google-api-core==1.22.4
google-auth==1.22.1
google-auth-oauthlib==0.4.1
google-cloud-bigquery==2.1.0
google-cloud-bigquery-storage==2.0.0
google-cloud-core==1.4.3
google-cloud-storage==1.31.2
google-crc32c==1.0.0
google-resumable-media==1.1.0
googleads==22.0.0
googleapis-common-protos==1.52.0
grpcio==1.32.0
idna==2.10
intuit-oauth==1.2.2
isodate==0.6.0
jmespath==0.10.0
libcst==0.3.12
lxml==4.5.2
mandrill==1.0.57
mypy-extensions==0.4.3
numpy==1.19.2
oauthlib==3.1.0
pandas==0.25.3
pandas-gbq==0.14.0
proto-plus==1.10.1
protobuf==3.12.0
pyarrow==1.0.1
pyasn1==0.4.8
pyasn1-modules==0.2.8
pycparser==2.18
pydata-google-auth==1.1.0
pymongo==3.7.2
python-bsonstream==0.1.3
python-dateutil==2.8.1
python-jose==3.2.0
python-quickbooks==0.8.1
pytz==2020.1
PyXB==1.2.5
PyYAML==5.3.1
rauth==0.7.3
requests==2.24.0
requests-file==1.5.1
requests-oauthlib==1.3.0
requests-toolbelt==0.9.1
rsa==4.6
s3transfer==0.3.3
simple-salesforce==1.0.0
simplejson==3.17.0
six==1.15.0
stripe==2.38.0
typing-extensions==3.7.4.3
typing-inspect==0.6.0
urllib3==1.25.10
virtualenv==16.7.6
xmltodict==0.12.0
zeep==4.0.0

@shska

shska commented Feb 8, 2021

I got this issue and fixed it by running "pip install protobuf==3.14.0".

@parnell

parnell commented Mar 5, 2021

I also had the same error, but protobuf version 3.14.0 didn't solve it. Version 3.15.0 and above seems to have worked; 3.15.5 in my case.

pip install -U protobuf

@YohanObadia

Same error with protobuf 3.6

@meredithslota
Contributor

Hello @YohanObadia — can you please file a new issue (cross-reference this one if needed) describing what you're experiencing? Please include relevant package version details. I will ask @plamut to look into it as a follow-up to this item. Thank you!

@plamut
Contributor

plamut commented May 11, 2021

Sorry for missing the last few comments; it's an old, closed issue.

Import errors like these often indicate that not all dependencies are up to date, and upgrading them usually resolves the problem. If that's not the case, then yes, please do open a new issue with all the details, and we'll have a look.
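Before filing a new issue, a quick way to collect the relevant versions in one place (a sketch using only the standard library; requires Python 3.8+ for importlib.metadata):

```python
import importlib.metadata as md

# Report installed versions of the packages implicated in this thread;
# missing packages are reported rather than raising.
for pkg in ("google-cloud-bigquery", "protobuf", "proto-plus", "six"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")
```

Pasting that output into a new issue makes version mismatches immediately visible.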

@bhajesh2

When you see an error like this, it can also mean that the installed bigquery version is not the right one for the OS where you installed it.

I had a similar issue on my OS (Ubuntu 16.0), and when I downgraded my bigquery version from 2.4.0 to 1.10.1 the issue was resolved.

@Bapi-Reddy

When you see an error like this, it can also mean that the installed bigquery version is not the right one for the OS where you installed it.

I had a similar issue on my OS (Ubuntu 16.0), and when I downgraded my bigquery version from 2.4.0 to 1.10.1 the issue was resolved.

This worked thanks

@TheMightyRaider

Upgrading protobuf to version 3.20.3 worked for me
