
Releases: googleapis/python-bigquery-pandas

Version 0.14.1

10 Nov 17:16
ac2d2fe
  • Use object dtype for TIME columns. (#328)
  • Encode floating point values with greater precision. (#326)
  • Support INT64 and other standard SQL aliases in
    pandas_gbq.to_gbq table_schema argument. (#322)
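A minimal sketch of the new aliases in table_schema (the destination table and column names below are hypothetical):

```python
# As of 0.14.1, standard SQL type names such as INT64 and FLOAT64 are
# accepted in table_schema alongside the legacy INTEGER/FLOAT names.
table_schema = [
    {"name": "id", "type": "INT64"},
    {"name": "score", "type": "FLOAT64"},
]

def upload(df):
    # Imported lazily; the call itself requires Google Cloud credentials.
    import pandas_gbq
    pandas_gbq.to_gbq(df, "my_dataset.demo_table", table_schema=table_schema)
```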

https://pypi.org/project/pandas-gbq/0.14.1/

Version 0.14.0

05 Oct 21:21
0e3e3f0

0.14.0 / 2020-10-05

  • Add dtypes argument to read_gbq. Use this argument to override
    the default dtype for a particular column in the query results.
    For example, this can be used to select nullable integer columns as
    the Int64 nullable integer pandas extension type. (#242, #332)
df = pandas_gbq.read_gbq(
    "SELECT CAST(NULL AS INT64) AS null_integer",
    dtypes={"null_integer": "Int64"},
)

Dependency updates

  • Support google-cloud-bigquery-storage 2.0 and higher. (#329)
  • Update the minimum version of pandas to 0.20.1. (#331)

Internal changes

  • Update tests to run against Python 3.8. (#331)

Version 0.13.3

30 Sep 16:21
a9cd0fc
  • Include needed "extras" from google-cloud-bigquery package as
    dependencies. Exclude incompatible 2.0 version. (#324, #329)


Version 0.13.1

13 Feb 17:09
e177978
  • Fix AttributeError with BQ Storage API to download empty results. (#299)


Version 0.13.0

12 Dec 22:49
2897b81
  • Raise NotImplementedError when the deprecated private_key argument is used. (#301)

Version 0.12.0

25 Nov 22:22
9fb2464

New features

  • Add max_results argument to pandas_gbq.read_gbq(). Use this
    argument to limit the number of rows in the results DataFrame. Set
    max_results to 0 to ignore query outputs, such as for DML or DDL
    queries. (#102)
  • Add progress_bar_type argument to pandas_gbq.read_gbq(). Use
    this argument to display a progress bar when downloading data.
    (#182)
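A hedged sketch combining the two new arguments (the query strings and the tqdm progress-bar choice are illustrative):

```python
def execute_only(sql):
    # max_results=0 skips row download entirely, which is handy for DML/DDL
    # statements where only execution matters. Requires BigQuery credentials.
    import pandas_gbq
    pandas_gbq.read_gbq(sql, max_results=0)

def preview(sql, n=1000):
    # progress_bar_type="tqdm" reports download progress via the tqdm package.
    import pandas_gbq
    return pandas_gbq.read_gbq(sql, max_results=n, progress_bar_type="tqdm")
```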

Dependency updates

  • Update the minimum version of google-cloud-bigquery to 1.11.1.
    (#296)

Documentation

  • Add code samples to introduction and refactor how-to guides. (#239)

Bug fixes

  • Fix resource leak with use_bqstorage_api by closing BigQuery Storage API client after use. (#294)


Version 0.11.0

29 Jul 20:06
9990047
  • Breaking Change: Python 2 support has been dropped, to align
    with the pandas package, which dropped Python 2 support at the end of 2019.
    (#268)

Enhancements

  • Ensure table_schema argument is not modified in place. (#278)

Implementation changes

  • Use object dtype for STRING, ARRAY, and STRUCT columns when
    there are zero rows. (#285)

Internal changes

  • Populate user-agent with pandas version information. (#281)
  • Fix pytest.raises usage for latest pytest. Fix warnings in tests.
    (#282)
  • Update CI to install nightly packages in the conda tests. (#254)

Version 0.10.0

05 Apr 13:37
f633fa9

Dependency updates

  • Update the minimum version of google-cloud-bigquery to 1.9.0. (#247)
  • Update the minimum version of pandas to 0.19.0. (#262)

Internal changes

  • Update the authentication credentials. Note: you may need to set reauth=True in order to update your credentials to the most recent version. This is required to use new functionality such as the BigQuery Storage API. (#267)
  • Use to_dataframe() from google-cloud-bigquery in the read_gbq() function. (#247)

Enhancements

  • Fix a bug where pandas-gbq could not upload an empty DataFrame. (#237)
  • Allow table_schema in to_gbq to contain only a subset of columns, with the rest populated from the DataFrame dtypes. (#218) (contributed by @JohnPaton)
  • Read project_id in to_gbq from provided credentials, if available. (contributed by @daureg)
  • read_gbq uses the timezone-aware DatetimeTZDtype(unit='ns', tz='UTC') dtype for BigQuery TIMESTAMP columns. (#269)
  • Add use_bqstorage_api to read_gbq. The BigQuery Storage API can be used to download large query results (>125 MB) more quickly. If the BigQuery Storage API cannot be used, the BigQuery API is used instead. (#133, #270)
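For example, a partial table_schema only needs to list the columns whose types you want to override; the rest are inferred from the DataFrame dtypes (the destination and column names below are hypothetical):

```python
# Only "ts" is overridden; all other columns take their BigQuery types from
# the DataFrame dtypes (supported since 0.10.0).
partial_schema = [{"name": "ts", "type": "TIMESTAMP"}]

def append_events(df):
    # Imported lazily; the call itself requires Google Cloud credentials.
    import pandas_gbq
    pandas_gbq.to_gbq(
        df,
        "my_dataset.events",          # placeholder destination
        table_schema=partial_schema,  # remaining columns inferred from df
        if_exists="append",
    )
```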

Version 0.9.0

11 Jan 17:55
b0254c4
  • Warn when the deprecated private_key parameter is used. (#240)
  • New dependency: use the pydata-google-auth package for authentication. (#241)


Version 0.8.0

12 Nov 09:13
398e75e

Breaking changes

  • Deprecate the private_key parameter to pandas_gbq.read_gbq and pandas_gbq.to_gbq in favor of the new credentials argument. Instead, create a credentials object using google.oauth2.service_account.Credentials.from_service_account_info or google.oauth2.service_account.Credentials.from_service_account_file. See the authentication how-to guide for examples. (#161, #231)
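A sketch of the replacement pattern (the key-file path is a placeholder for your own service-account JSON key):

```python
def read_with_service_account(sql, key_path):
    # Build explicit credentials instead of passing the deprecated
    # private_key argument; running this requires a valid key file.
    import pandas_gbq
    from google.oauth2 import service_account

    credentials = service_account.Credentials.from_service_account_file(key_path)
    return pandas_gbq.read_gbq(sql, credentials=credentials)
```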

Enhancements

  • Allow newlines in data passed to to_gbq. (#180)
  • Add pandas_gbq.context.dialect to allow overriding the default SQL syntax dialect. (#195, #235)
  • Support Python 3.7. (#197, #232)
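The new context option applies process-wide; a minimal sketch:

```python
def prefer_standard_sql():
    # Overrides the default dialect for subsequent read_gbq calls that do
    # not pass dialect explicitly. Valid values are "legacy" and "standard".
    import pandas_gbq
    pandas_gbq.context.dialect = "standard"
```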

Internal changes

  • Migrate tests to CircleCI. (#228, #232)