test: E2E test for time encoding (#941)
* Placeholder Stress Test for Setup

* Making a basic long running test for the Write API

* Simple stress test placeholder in folder st.

* Add simple time calculation

* Cleaning up before submitting pull

* caching changes

* Cleaning up and adding TODO

* Added copyright info at beginning of file

* Removing error causing lines

* Moving Before class into single test to fix permissions issues

* Ran mvn format, should fix lint errors

* Resolving comments, removing unnecessary code

* Fixing comments and cleaning up

* Moved creation of client back into BeforeClass and resolved some other comments

* Formatting fix

* Changed name from ST to IT to fix errors

* Formatting

* Aggregating data and logging once

* Refactoring down to only the simple case. Complex case will be handled in a different PR

* Quick rename

* Adding complex schema default stream test

* Formatting

* Adding Time Encoding Integration Test Placeholder

* Removing Stress test from this branch and moving it to the appropriate branch

* Add integration test to make sure that encoding and decoding across a table insertion holds up

* Added Integration test, renamed functions and got rid of redundant functions

* Fix License Header

* Java Lang set to 8 in order to use Java Local Time

* Removing nano functions, cleaning up comments

* Added round trip test to unit tests

* Moving to threeten time instead of java time

* Removing Java Time Dependency

* Lint

* Remove E2E test for another PR. Split Unit tests into better named tests

* Lint

* Combining Encode and Decode Test for easier reading

* Removing unused methods

* E2E test for time encoding with supporting change to BQTableSchemaToProtoDescriptor

* Remove INT64 fields and make Time fields repeated

* Lint

* Dealing with errors in BQTableSchemaToProtoDescriptorTest

* Fixing tests to accept the new Time format as integer.

* deps: update dependency com.google.cloud:google-cloud-bigquery to v1.127.10 (#955)

* chore: regenerate README (#950)

This PR was generated using Autosynth. 🌈


<details><summary>Log from Synthtool</summary>

```
2021-03-18 22:45:20,095 synthtool [DEBUG] > Executing /root/.cache/synthtool/java-bigquerystorage/.github/readme/synth.py.
On branch autosynth-readme
nothing to commit, working tree clean
2021-03-18 22:45:21,018 synthtool [DEBUG] > Wrote metadata to .github/readme/synth.metadata/synth.metadata.

```
</details>

Full log will be available here:
https://source.cloud.google.com/results/invocations/93099bcc-ab35-48d5-ab7a-79fbfbcb0bba/targets

- [ ] To automatically regenerate this PR, check this box.

* chore: regenerate README (#956)

This PR was generated using Autosynth. 🌈


<details><summary>Log from Synthtool</summary>

```
2021-03-22 15:32:14,816 synthtool [DEBUG] > Executing /root/.cache/synthtool/java-bigquerystorage/.github/readme/synth.py.
On branch autosynth-readme
nothing to commit, working tree clean
2021-03-22 15:32:15,852 synthtool [DEBUG] > Wrote metadata to .github/readme/synth.metadata/synth.metadata.

```
</details>

Full log will be available here:
https://source.cloud.google.com/results/invocations/9be4b08c-65bc-4ddb-81cf-9402bf4f1a1b/targets

- [ ] To automatically regenerate this PR, check this box.

* Fixing manual write client test

* Removing Stress Test, will be moved to its own repository for kokoro job.

* Cleaning up unnecessary diffs

* Cleaning up unnecessary diffs

* Small change to retrigger Integration Tests

Co-authored-by: WhiteSource Renovate <bot@renovateapp.com>
Co-authored-by: Yoshi Automation Bot <yoshi-automation@google.com>
3 people committed Mar 24, 2021
1 parent fce5289 commit 4442583
Showing 12 changed files with 191 additions and 200 deletions.
4 changes: 2 additions & 2 deletions .github/readme/synth.metadata/synth.metadata
@@ -4,14 +4,14 @@
"git": {
"name": ".",
"remote": "https://github.com/googleapis/java-bigquerystorage.git",
-"sha": "a6b53566cfd174fb36a903cbd84948defc0403e6"
+"sha": "1554247cf55aa56281a530c721ab1650699a3efc"
}
},
{
"git": {
"name": "synthtool",
"remote": "https://github.com/googleapis/synthtool.git",
-"sha": "79c8dd7ee768292f933012d3a69a5b4676404cda"
+"sha": "78437c732a60c64895778697b078497b0988346c"
}
}
]
10 changes: 5 additions & 5 deletions README.md
@@ -17,7 +17,7 @@ If you are using Maven with [BOM][libraries-bom], add this to your pom.xml file
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>libraries-bom</artifactId>
-<version>19.2.1</version>
+<version>19.1.0</version>
<type>pom</type>
<scope>import</scope>
</dependency>
@@ -38,25 +38,25 @@ If you are using Maven without BOM, add this to your dependencies:
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-bigquerystorage</artifactId>
-<version>1.15.1</version>
+<version>1.15.0</version>
</dependency>

```

If you are using Gradle 5.x or later, add this to your dependencies
```Groovy
-implementation platform('com.google.cloud:libraries-bom:19.2.1')
+implementation platform('com.google.cloud:libraries-bom:19.1.0')
compile 'com.google.cloud:google-cloud-bigquerystorage'
```
If you are using Gradle without BOM, add this to your dependencies
```Groovy
-compile 'com.google.cloud:google-cloud-bigquerystorage:1.15.1'
+compile 'com.google.cloud:google-cloud-bigquerystorage:1.15.0'
```

If you are using SBT, add this to your dependencies
```Scala
-libraryDependencies += "com.google.cloud" % "google-cloud-bigquerystorage" % "1.15.1"
+libraryDependencies += "com.google.cloud" % "google-cloud-bigquerystorage" % "1.15.0"
```

## Authentication
@@ -47,14 +47,14 @@ public class BQTableSchemaToProtoDescriptor {
.put(TableFieldSchema.Type.BOOL, FieldDescriptorProto.Type.TYPE_BOOL)
.put(TableFieldSchema.Type.BYTES, FieldDescriptorProto.Type.TYPE_BYTES)
.put(TableFieldSchema.Type.DATE, FieldDescriptorProto.Type.TYPE_INT32)
-.put(TableFieldSchema.Type.DATETIME, FieldDescriptorProto.Type.TYPE_STRING)
+.put(TableFieldSchema.Type.DATETIME, FieldDescriptorProto.Type.TYPE_INT64)
.put(TableFieldSchema.Type.DOUBLE, FieldDescriptorProto.Type.TYPE_DOUBLE)
.put(TableFieldSchema.Type.GEOGRAPHY, FieldDescriptorProto.Type.TYPE_STRING)
.put(TableFieldSchema.Type.INT64, FieldDescriptorProto.Type.TYPE_INT64)
.put(TableFieldSchema.Type.NUMERIC, FieldDescriptorProto.Type.TYPE_STRING)
.put(TableFieldSchema.Type.STRING, FieldDescriptorProto.Type.TYPE_STRING)
.put(TableFieldSchema.Type.STRUCT, FieldDescriptorProto.Type.TYPE_MESSAGE)
-.put(TableFieldSchema.Type.TIME, FieldDescriptorProto.Type.TYPE_STRING)
+.put(TableFieldSchema.Type.TIME, FieldDescriptorProto.Type.TYPE_INT64)
.put(TableFieldSchema.Type.TIMESTAMP, FieldDescriptorProto.Type.TYPE_INT64)
.build();
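For context, the INT64 values these TIME and DATETIME fields now carry are packed 64-bit civil times. The following standalone sketch re-implements the assumed bit layout (per the ZetaSQL civil-time format); the class names and helper methods here are illustrative, not part of the library — the shipped encoder is `CivilTimeEncoder`.

```java
public class PackedTimeMicrosSketch {
    // Packs a civil time into one long, assuming this bit layout
    // (high to low): hour 5 bits | minute 6 bits | second 6 bits | micros 20 bits.
    static long encodePacked64TimeMicros(int hour, int minute, int second, int micros) {
        return (((long) hour << 6 | minute) << 6 | second) << 20 | micros;
    }

    // Reverses the packing above by masking out each bit field.
    static int[] decodePacked64TimeMicros(long packed) {
        int micros = (int) (packed & 0xFFFFF);        // bits 0-19
        int second = (int) ((packed >>> 20) & 0x3F);  // bits 20-25
        int minute = (int) ((packed >>> 26) & 0x3F);  // bits 26-31
        int hour   = (int) ((packed >>> 32) & 0x1F);  // bits 32-36
        return new int[] {hour, minute, second, micros};
    }

    public static void main(String[] args) {
        // 13:14:15.016000, the first value the integration test inserts
        long packed = encodePacked64TimeMicros(13, 14, 15, 16_000);
        int[] p = decodePacked64TimeMicros(packed);
        System.out.println(p[0] + ":" + p[1] + ":" + p[2] + "." + p[3]);
    }
}
```

Because the fields nest from micros upward, the round trip is lossless for any valid time of day, which is what the round-trip unit tests mentioned in the commit list verify.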

@@ -201,7 +201,7 @@ public void testStructComplex() throws Exception {
.build();
final Table.TableFieldSchema TEST_TIME =
Table.TableFieldSchema.newBuilder()
-.setType(Table.TableFieldSchema.Type.TIME)
+.setType(Table.TableFieldSchema.Type.INT64)
.setMode(Table.TableFieldSchema.Mode.NULLABLE)
.setName("test_time")
.build();
@@ -38,13 +38,13 @@ public class BQTableSchemaToProtoDescriptorTest {
.put(TableFieldSchema.Type.BOOL, BoolType.getDescriptor())
.put(TableFieldSchema.Type.BYTES, BytesType.getDescriptor())
.put(TableFieldSchema.Type.DATE, Int32Type.getDescriptor())
-.put(TableFieldSchema.Type.DATETIME, StringType.getDescriptor())
+.put(TableFieldSchema.Type.DATETIME, Int64Type.getDescriptor())
.put(TableFieldSchema.Type.DOUBLE, DoubleType.getDescriptor())
.put(TableFieldSchema.Type.GEOGRAPHY, StringType.getDescriptor())
.put(TableFieldSchema.Type.INT64, Int64Type.getDescriptor())
.put(TableFieldSchema.Type.NUMERIC, StringType.getDescriptor())
.put(TableFieldSchema.Type.STRING, StringType.getDescriptor())
-.put(TableFieldSchema.Type.TIME, StringType.getDescriptor())
+.put(TableFieldSchema.Type.TIME, Int64Type.getDescriptor())
.put(TableFieldSchema.Type.TIMESTAMP, Int64Type.getDescriptor())
.build();

@@ -151,7 +151,7 @@ public void decodePacked64TimeMicros_invalidHourOfDay_throwsIllegalArgumentExcep
}
}

-// Date Time
+// Date Time Tests
@Test
public void encodeAndDecodePacked64DatetimeMicros_validDateTime() {
// 0001/01/01 00:00:00
@@ -48,6 +48,7 @@
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.threeten.bp.Instant;
+import org.threeten.bp.LocalTime;

@RunWith(JUnit4.class)
public class JsonStreamWriterTest {
@@ -410,7 +411,7 @@ public void testSingleAppendComplexJson() throws Exception {
.setTestNumeric("1.23456")
.setTestGeo("POINT(1,1)")
.setTestTimestamp(12345678)
-.setTestTime("01:00:01")
+.setTestTime(CivilTimeEncoder.encodePacked64TimeMicros(LocalTime.of(1, 0, 1)))
.build();
JSONObject complex_lvl2 = new JSONObject();
complex_lvl2.put("test_int", 3);
@@ -431,7 +432,7 @@
json.put("test_numeric", "1.23456");
json.put("test_geo", "POINT(1,1)");
json.put("test_timestamp", 12345678);
-json.put("test_time", "01:00:01");
+json.put("test_time", CivilTimeEncoder.encodePacked64TimeMicros(LocalTime.of(1, 0, 1)));
JSONArray jsonArr = new JSONArray();
jsonArr.put(json);

@@ -0,0 +1,166 @@
/*
* Copyright 2021 Google LLC
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.google.cloud.bigquery.storage.v1beta2.it;

import static org.junit.Assert.assertEquals;

import com.google.api.core.ApiFuture;
import com.google.cloud.ServiceOptions;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.DatasetInfo;
import com.google.cloud.bigquery.Field.Mode;
import com.google.cloud.bigquery.FieldValueList;
import com.google.cloud.bigquery.Schema;
import com.google.cloud.bigquery.StandardSQLTypeName;
import com.google.cloud.bigquery.StandardTableDefinition;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.TableInfo;
import com.google.cloud.bigquery.TableResult;
import com.google.cloud.bigquery.storage.v1beta2.AppendRowsResponse;
import com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient;
import com.google.cloud.bigquery.storage.v1beta2.CivilTimeEncoder;
import com.google.cloud.bigquery.storage.v1beta2.JsonStreamWriter;
import com.google.cloud.bigquery.storage.v1beta2.TableName;
import com.google.cloud.bigquery.testing.RemoteBigQueryHelper;
import com.google.protobuf.Descriptors;
import java.io.IOException;
import java.util.Iterator;
import java.util.concurrent.ExecutionException;
import org.json.JSONArray;
import org.json.JSONObject;
import org.junit.AfterClass;
import org.junit.Assert;
import org.junit.BeforeClass;
import org.junit.Test;
import org.threeten.bp.LocalDateTime;
import org.threeten.bp.LocalTime;

public class ITBigQueryTimeEncoderTest {
private static final String DATASET = RemoteBigQueryHelper.generateDatasetName();
private static final String TABLE = "testtable";
private static final String DESCRIPTION = "BigQuery Write Java manual client test dataset";

private static BigQueryWriteClient client;
private static TableInfo tableInfo;
private static BigQuery bigquery;

@BeforeClass
public static void beforeClass() throws IOException {
client = BigQueryWriteClient.create();

RemoteBigQueryHelper bigqueryHelper = RemoteBigQueryHelper.create();
bigquery = bigqueryHelper.getOptions().getService();
DatasetInfo datasetInfo =
DatasetInfo.newBuilder(/* datasetId = */ DATASET).setDescription(DESCRIPTION).build();
bigquery.create(datasetInfo);
tableInfo =
TableInfo.newBuilder(
TableId.of(DATASET, TABLE),
StandardTableDefinition.of(
Schema.of(
com.google.cloud.bigquery.Field.newBuilder(
"test_str", StandardSQLTypeName.STRING)
.build(),
com.google.cloud.bigquery.Field.newBuilder(
"test_time_micros", StandardSQLTypeName.TIME)
.setMode(Mode.REPEATED)
.build(),
com.google.cloud.bigquery.Field.newBuilder(
"test_datetime_micros", StandardSQLTypeName.DATETIME)
.setMode(Mode.REPEATED)
.build())))
.build();
bigquery.create(tableInfo);
}

@AfterClass
public static void afterClass() {
if (client != null) {
client.close();
}
if (bigquery != null) {
RemoteBigQueryHelper.forceDelete(bigquery, DATASET);
}
}

@Test
public void TestTimeEncoding()
throws IOException, InterruptedException, ExecutionException,
Descriptors.DescriptorValidationException {
TableName parent = TableName.of(ServiceOptions.getDefaultProjectId(), DATASET, TABLE);
try (JsonStreamWriter jsonStreamWriter =
JsonStreamWriter.newBuilder(parent.toString(), tableInfo.getDefinition().getSchema())
.createDefaultStream()
.build()) {
JSONObject row = new JSONObject();
row.put("test_str", "Start of the day");
row.put(
"test_time_micros",
new JSONArray(
new long[] {
CivilTimeEncoder.encodePacked64TimeMicros(LocalTime.of(13, 14, 15, 16_000_000)),
CivilTimeEncoder.encodePacked64TimeMicros(LocalTime.of(23, 59, 59, 999_999_000)),
CivilTimeEncoder.encodePacked64TimeMicros(LocalTime.of(0, 0, 0, 0)),
CivilTimeEncoder.encodePacked64TimeMicros(LocalTime.of(1, 2, 3, 4_000)),
CivilTimeEncoder.encodePacked64TimeMicros(LocalTime.of(5, 6, 7, 8_000))
}));
row.put(
"test_datetime_micros",
new JSONArray(
new long[] {
CivilTimeEncoder.encodePacked64DatetimeMicros(
LocalDateTime.of(1, 1, 1, 12, 0, 0, 0)),
CivilTimeEncoder.encodePacked64DatetimeMicros(
LocalDateTime.of(1995, 5, 19, 10, 30, 45, 0)),
CivilTimeEncoder.encodePacked64DatetimeMicros(
LocalDateTime.of(2000, 1, 1, 0, 0, 0, 0)),
CivilTimeEncoder.encodePacked64DatetimeMicros(
LocalDateTime.of(2026, 3, 11, 5, 45, 12, 9_000_000)),
CivilTimeEncoder.encodePacked64DatetimeMicros(
LocalDateTime.of(2050, 1, 2, 3, 4, 5, 6_000)),
}));
JSONArray jsonArr = new JSONArray(new JSONObject[] {row});
ApiFuture<AppendRowsResponse> response = jsonStreamWriter.append(jsonArr, -1);
Assert.assertFalse(response.get().getAppendResult().hasOffset());
TableResult result =
bigquery.listTableData(
tableInfo.getTableId(), BigQuery.TableDataListOption.startIndex(0L));
Iterator<FieldValueList> iter = result.getValues().iterator();
FieldValueList currentRow;
currentRow = iter.next();
assertEquals("Start of the day", currentRow.get(0).getValue());
assertEquals("13:14:15.016000", currentRow.get(1).getRepeatedValue().get(0).getStringValue());
assertEquals("23:59:59.999999", currentRow.get(1).getRepeatedValue().get(1).getStringValue());
assertEquals("00:00:00", currentRow.get(1).getRepeatedValue().get(2).getStringValue());
assertEquals("01:02:03.000004", currentRow.get(1).getRepeatedValue().get(3).getStringValue());
assertEquals("05:06:07.000008", currentRow.get(1).getRepeatedValue().get(4).getStringValue());

assertEquals(
"0001-01-01T12:00:00", currentRow.get(2).getRepeatedValue().get(0).getStringValue());
assertEquals(
"1995-05-19T10:30:45", currentRow.get(2).getRepeatedValue().get(1).getStringValue());
assertEquals(
"2000-01-01T00:00:00", currentRow.get(2).getRepeatedValue().get(2).getStringValue());
assertEquals(
"2026-03-11T05:45:12.009000",
currentRow.get(2).getRepeatedValue().get(3).getStringValue());
assertEquals(
"2050-01-02T03:04:05.000006",
currentRow.get(2).getRepeatedValue().get(4).getStringValue());
}
}
}
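The DATETIME string expectations above (e.g. `"1995-05-19T10:30:45"`) come from values packed by `CivilTimeEncoder.encodePacked64DatetimeMicros`. A hedged sketch of the analogous datetime packing, assuming the ZetaSQL bit widths (this class name and method are illustrative, not library API):

```java
public class PackedDatetimeMicrosSketch {
    // Assumed bit layout (high to low):
    // year 14 | month 4 | day 5 | hour 5 | minute 6 | second 6 | micros 20
    static long encodePacked64DatetimeMicros(
            int year, int month, int day, int hour, int minute, int second, int micros) {
        long packed = year;
        packed = packed << 4 | month;   // shift left by each field's width,
        packed = packed << 5 | day;     // then OR in the next field
        packed = packed << 5 | hour;
        packed = packed << 6 | minute;
        packed = packed << 6 | second;
        packed = packed << 20 | micros;
        return packed;
    }

    public static void main(String[] args) {
        // 1995-05-19T10:30:45, one of the rows the integration test appends
        System.out.println(
                Long.toHexString(encodePacked64DatetimeMicros(1995, 5, 19, 10, 30, 45, 0)));
    }
}
```

Since BigQuery stores the column as TIME/DATETIME rather than raw INT64, `listTableData` returns the decoded civil-time strings, which is exactly what the assertions in the test above check.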
@@ -41,6 +41,7 @@
import org.junit.BeforeClass;
import org.junit.Test;
import org.threeten.bp.Duration;
+import org.threeten.bp.LocalDateTime;

/** Integration tests for BigQuery Write API. */
public class ITBigQueryWriteManualClientTest {
@@ -240,7 +241,9 @@ public void testJsonStreamWriterCommittedStream()
JSONObject row1 = new JSONObject();
row1.put("test_str", "aaa");
row1.put("test_numerics", new JSONArray(new String[] {"123.4", "-9000000"}));
-row1.put("test_datetime", "2020-10-1 12:00:00");
+row1.put(
+    "test_datetime",
+    CivilTimeEncoder.encodePacked64DatetimeMicros(LocalDateTime.of(2020, 10, 1, 12, 0)));
JSONArray jsonArr1 = new JSONArray(new JSONObject[] {row1});

ApiFuture<AppendRowsResponse> response1 = jsonStreamWriter.append(jsonArr1, -1);
@@ -313,7 +316,9 @@ public void testJsonStreamWriterWithDefaultStream()
JSONObject row1 = new JSONObject();
row1.put("test_str", "aaa");
row1.put("test_numerics", new JSONArray(new String[] {"123.4", "-9000000"}));
-row1.put("test_datetime", "2020-10-1 12:00:00");
+row1.put(
+    "test_datetime",
+    CivilTimeEncoder.encodePacked64DatetimeMicros(LocalDateTime.of(2020, 10, 1, 12, 0)));
JSONArray jsonArr1 = new JSONArray(new JSONObject[] {row1});

ApiFuture<AppendRowsResponse> response1 = jsonStreamWriter.append(jsonArr1, -1);
