`to_gbq` fails to create empty table with correct schema #376
Comments
Thanks for the report! I agree this looks like a bug.
There are a couple of issues here.
In summary, the dataframe you give must have a column for every column you want, and the types in the schema must be BigQuery type strings, not pandas dtypes.
@jimfulton thanks for the reply, but I'm a bit confused, since I'm passing a schema with all my column names.
When you say "the dataframe you give must have a column for every column you want", does this mean I have to provide the columns when constructing the dataframe, e.g. pandas_gbq.to_gbq(dataframe=pd.DataFrame(columns=[list of col names]), ...)?
Good to know, I will give this a try.
You're passing an empty dataframe, with no columns.
The dataframe you pass in must have the columns you want. Alternatively, if you have a list of column names, you could do something like:
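A minimal sketch of that suggestion (the column names and types here are hypothetical, not from the original thread): build the empty dataframe's columns directly from the schema you already have, so the two can't drift apart.

```python
import pandas as pd

# Hypothetical table_schema with BigQuery type strings.
table_schema = [
    {"name": "name", "type": "STRING"},
    {"name": "age", "type": "INT64"},
]

# Derive the empty dataframe's columns from the schema itself.
df = pd.DataFrame(columns=[field["name"] for field in table_schema])
```

The dataframe stays empty (zero rows), but every column in the schema is present, so `to_gbq` can map the schema onto it.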
You want something like:
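The original snippet was lost from the thread; a hedged reconstruction of the working call, using placeholder dataset, table, and project names (the `to_gbq` call itself is commented out because it needs GCP credentials):

```python
import pandas as pd
# import pandas_gbq  # assumed installed in your environment

# Empty dataframe that still declares every column the table needs.
df = pd.DataFrame(columns=["name", "age"])

# Types are BigQuery type strings ("STRING", "INT64"), not pandas dtypes.
table_schema = [
    {"name": "name", "type": "STRING"},
    {"name": "age", "type": "INT64"},
]

# Hypothetical destination; substitute your own dataset, table, and project.
# pandas_gbq.to_gbq(df, "my_dataset.my_table", project_id="my-project",
#                   table_schema=table_schema, if_exists="replace")
```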
@tswast I've confirmed that the suggestions made by @jimfulton work, so this is a documentation problem. Providing a columns list and making the schema types BigQuery type strings solves the problem.
Thanks for the update. I'll close this once we clarify the docs.
Closed by #383.
I want to create an empty table that has a specific schema, which in this case I'm inferring from a dask dataframe. The table gets created on BigQuery, but it shows up with no schema.
Minimal reproducible example:
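The original example code did not survive; a sketch of the failing call as reconstructed from the discussion (column names, dataset, and project are placeholders): the dataframe has no columns, and the schema uses pandas dtype names rather than BigQuery types, so the created table ends up with no schema.

```python
import pandas as pd
# import pandas_gbq

# A completely empty dataframe -- no columns for the schema to map onto.
df = pd.DataFrame()

# Note: these are pandas dtypes, not BigQuery type strings,
# which is the second half of the problem.
schema = [
    {"name": "name", "type": "object"},
    {"name": "age", "type": "int64"},
]

# Hypothetical destination; the call needs GCP credentials to run.
# pandas_gbq.to_gbq(df, "my_dataset.my_table", project_id="my-project",
#                   table_schema=schema, if_exists="replace")
```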
But when I check the schema on the table created I get:
Environment details