(NB: Not sure whether this is an upstream bug or intended upstream behavior, but I was directed here for this issue!)

I have data spread across more files than there are workers, and I ran into trouble using parallel_import to upload them to Myria. See the following queries:

https://rest.myria.cs.washington.edu:1776/query/query-70837 -- in this query, I assigned five files to three workers and got:

edu.washington.escience.myria.DbException: Query #70837.0 failed: ErrorCode: 0, SQLState: 42P07, Msg: ERROR: relation "public:adhoc:supertinyngramtest" already exists

https://rest.myria.cs.washington.edu:1776/query/query-70838 -- exactly the same query, except each file is assigned to a unique worker. This one runs successfully.

Both queries have "argOverwriteTable": true, since at first I thought I was double-ingesting; however, an earlier query where this was false also failed.

Yep -- there must currently be a one-to-one correspondence between input URLs and workers. This makes me sad, and it will hopefully be fixed soon.

In the case of more URLs than workers, it's just a matter of unioning the extra sources before the DbInsert operator. A workaround is to generate the JSON plan by hand and do the union manually. Yuck.
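For anyone writing that JSON plan by hand, the assignment step can be sketched as a simple round-robin of input URLs over workers, so each worker that receives more than one URL gets its extra scans unioned before DbInsert. This is only an illustrative sketch: the helper name and file paths below are hypothetical, not part of the real Myria API.

```python
def assign_urls_to_workers(urls, num_workers):
    """Round-robin input URLs onto workers; extra URLs wrap around.

    Each worker's list of URLs corresponds to the scans that a
    hand-written plan would union before the DbInsert operator.
    """
    assignment = {w: [] for w in range(num_workers)}
    for i, url in enumerate(urls):
        assignment[i % num_workers].append(url)
    return assignment


# Hypothetical example: five input files, three workers.
urls = [f"s3://bucket/ngrams-part-{i}.csv" for i in range(5)]
plan = assign_urls_to_workers(urls, num_workers=3)

# Workers 0 and 1 each read two files (their scans get unioned);
# worker 2 reads one.
for worker, worker_urls in sorted(plan.items()):
    print(worker, worker_urls)
```

With this mapping in hand, the hand-written plan gives each worker one scan per assigned URL plus a union, instead of assuming exactly one URL per worker.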