Node Client - load from file throwing error #136

AlexLeonte opened this issue Jan 17, 2023 · 4 comments · May be fixed by #189

Comments

@AlexLeonte

Load from file is not working when using the Node.js client.

href: 'http://0.0.0.0:9050/upload/bigquery/v2/projects//jobs?uploadType=multipart'
message: 'runtime error: invalid memory address or nil pointer dereference'

@goccy
Owner

goccy commented Jan 17, 2023

@AlexLeonte This is not enough information to investigate; please include the emulator version you used and code that reproduces the issue.

@AlexLeonte
Author

AlexLeonte commented Jan 17, 2023

const {BigQuery} = require('@google-cloud/bigquery');
const path = require('path');

async function createTable() {
  const bigquery = new BigQuery({
    apiEndpoint: 'http://0.0.0.0:9050/',
    projectId: 'gccstart'
  })

  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const options = {
    location: 'US'
  }

  // Create a new table in the dataset
  const [table] = await bigquery
    .dataset(datasetId)
    .createTable(tableId, options)

  console.log(`Table ${table.id} created.`);
}
async function createDataset(){
  const bigquery = new BigQuery({
    apiEndpoint: 'http://0.0.0.0:9050',
    projectId: 'gccstart'
  })

  const datasetId = 'my_dataset'

  const [dataset] = await bigquery.createDataset(datasetId)
  console.log(`Dataset ${dataset.id} created.`)
}
async function populateTable() {
  const bigquery = new BigQuery({
    apiEndpoint: 'http://0.0.0.0:9050',
    projectId: 'gccstart'
  })

  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const metadata = {
    sourceFormat: 'NEWLINE_DELIMITED_JSON',
    schema: {
        fields: [
        {name: 'name', type: 'STRING'},
        {name: 'id', type: 'STRING'},
        ],
    },
    location: 'US',
  };

  const filePath = path.join('bqvalues.json')
  await bigquery
    .dataset(datasetId)
    .table(tableId)
    .load(filePath, metadata)
}
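
Running the functions in order reproduces the error on the final load call; a minimal driver for the snippets above (my addition, names match the functions as posted) might look like:

(async () => {
  await createDataset()
  await createTable()
  await populateTable() // fails with: invalid memory address or nil pointer dereference
})()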

bqvalues.json (one record per line, as NEWLINE_DELIMITED_JSON requires):

{"name": "Alice", "id": "12"}
{"name": "Bob", "id": "34"}
{"name": "Charles", "id": "45"}

For auth, GOOGLE_APPLICATION_CREDENTIALS points to a valid account.
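
The emulator version isn't stated above; assuming the Docker distribution from the project README, the version under test can be pinned and reported via the image tag, e.g.:

docker run -p 9050:9050 ghcr.io/goccy/bigquery-emulator:latest --project=gccstart

(--project here matches the projectId used by the client; the flag and the default port 9050 follow the emulator's README, so treat the exact invocation as illustrative.)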

@sklakken

sklakken commented Feb 24, 2023

@goccy I'm running into the exact same issue using the 'latest' docker image as of 2/24/2023.

  await bigquery.dataset(datasetId).table(tableId).insert([{
    policy: 'test',
    uuid: 'uuid',
    timestamp: new Date(),
    result: {
      id: 'result'
    },
    input: {
      id: 'input'
    },
  }])

Update: the JSON datatype is not working. As a workaround, I changed the JSON columns to STRING; I'll change them back when running on real GCP. Hopefully the JSON datatype will be supported soon. :)

From:
    fields.push({name: 'result'   , type: 'JSON',   mode: 'REQUIRED'})
    fields.push({name: 'input'    , type: 'JSON',   mode: 'REQUIRED'})

    const result = { result: 'result'}
    const input  = { input: 'input'}
    
    const decision = {
      policy: 'test',
      uuid: uuidv4(),
      timestamp: new Date(),
      result: result,
      input: input
    }   

To:

    fields.push({name: 'result'   , type: 'STRING',   mode: 'REQUIRED'})
    fields.push({name: 'input'    , type: 'STRING',   mode: 'REQUIRED'})

    const result = JSON.stringify({ result: 'result'})
    const input  = JSON.stringify({ input: 'input'})

    const decision = {
      policy: 'test',
      uuid: uuidv4(),
      timestamp: new Date(),
      result: result,
      input: input
    }   
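
With the STRING workaround, rows read back from the table hold serialized JSON, so consumers have to parse it themselves; a minimal sketch (assuming the same bigquery, datasetId, and tableId as above):

    // Read rows back and restore the object shape of the stringified columns
    const [rows] = await bigquery.dataset(datasetId).table(tableId).getRows()
    const decisions = rows.map((row) => ({
      ...row,
      result: JSON.parse(row.result), // stored as a JSON string per the workaround
      input: JSON.parse(row.input),
    }))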

@SebHeuze

I have the same problem; hope it will be supported soon :)

@aerben linked a pull request (#189) on May 12, 2023 that will close this issue.