I'm encountering the same issue.
If I use a "return null" statement to skip the document row, I get a "schema can not be null" error.
Did anyone manage to resolve the issue?
Many thanks!
Hi @britz89. I have not found a fix, but I found an alternative way to skip it.
I pull all the data, then run a saved query to create a new table from the import. The filter is applied in that saved query.
So if I understood correctly, you are pulling the full collection, storing it in a temp table, and then filtering the rows in a subsequent step. Correct?
My requirement is to avoid a full copy of the collection, so I hope this issue gets fixed; otherwise I will have to find another way.
Thanks for your suggestion, btw!
Yes, that is what I am currently doing until it is fixed, because I need a working solution in the meantime. The other alternative I thought about is using a custom batch template and fixing the issue there.
Related Template(s)
MongoDB-to-BigQuery
Template Version
v2
What happened?
I have a function that checks whether a field is true. If it is true, the function returns null to skip saving that document into BigQuery.
I have also tried return undefined and return "", and I keep getting the same "schema can not be null" error.
Below is a code snippet
I referred to the example described in this link:
https://cloud.google.com/dataflow/docs/guides/templates/create-template-udf#filter_events
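The original snippet is not reproduced in this thread, so here is a minimal sketch of the pattern being described, modeled on the linked "filter events" example. The field name `skip` is a hypothetical stand-in for whatever flag the real documents use:

```javascript
/**
 * Hypothetical UDF sketch (not the reporter's actual code).
 * In the Dataflow "filter events" example, returning null is supposed
 * to drop the record; in the MongoDB-to-BigQuery template it instead
 * triggers the "schema can not be null" error reported above.
 */
function process(inJson) {
  const doc = JSON.parse(inJson);
  // "skip" is an assumed flag field; adjust to your document schema.
  if (doc.skip === true) {
    return null; // intended to skip this document; causes the reported error
  }
  return JSON.stringify(doc);
}
```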
The job was created using the Google Cloud console, not via the API or an SDK.
Relevant log output