
[#3305] test(spark-connector): move spark connector integration test from integration-test module to spark-connector #3307

Merged
merged 2 commits into from May 23, 2024

Conversation

@FANNG1 (Contributor) commented May 8, 2024:

What changes were proposed in this pull request?

Move the Spark connector integration tests from the integration-test module to the spark-connector module.

Why are the changes needed?

Fix: #3305

Does this PR introduce any user-facing change?

no

How was this patch tested?

By moving the test location; the relocated integration tests still run in CI.

@FANNG1 FANNG1 marked this pull request as draft May 8, 2024 08:26
@@ -54,6 +54,7 @@ kafka = "3.4.0"
curator = "2.12.0"
awaitility = "4.2.1"
servlet = "3.1.0"
jodd = "3.5.2"
@FANNG1 (Contributor, Author) commented on the diff:

TrinoQueryRunner uses this library.

@FANNG1 (Contributor, Author) commented May 20, 2024:

This is blocked by #3441; there are two ways to bypass it:

  1. Do not support embedded mode for the Spark integration tests.
  2. Replace java-client-runtime with java-client when testing; this is actually the same way the Spark connector ITs work in the integration-test module today.

I prefer option 2 if there are no better solutions. @jerryshao, @qqqttt123 @yuqi1129 @diqiu50 @mchades, WDYT?
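Option 2 above can be sketched in Gradle Kotlin DSL. The module paths below follow the Gravitino repository layout visible elsewhere in this PR, but treating the swap as a `testImplementation` dependency is an assumption, not the merged change:

```kotlin
// build.gradle.kts of the spark-connector test module (sketch, not the actual diff)
dependencies {
    // What production consumers depend on: the shaded runtime client.
    // implementation(project(":clients:client-java-runtime", configuration = "shadow"))

    // Option 2: while testing, depend on the plain java client instead, so test
    // classes compile against the original (non-relocated) packages and avoid
    // the shading problem tracked in #3441.
    testImplementation(project(":clients:client-java"))
}
```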

@jerryshao (Collaborator) commented:

Use solution 2 for now.

@FANNG1 (Contributor, Author) commented May 21, 2024:

> Use solution 2 for now.

ok

@FANNG1 FANNG1 force-pushed the move-test branch 2 times, most recently from d04f470 to b24aa23 Compare May 22, 2024 03:59
@FANNG1 FANNG1 marked this pull request as ready for review May 22, 2024 05:20
@FANNG1 FANNG1 requested review from yuqi1129 and jerryshao May 22, 2024 05:21
@FANNG1 (Contributor, Author) commented May 22, 2024:

@yuqi1129 @jerryshao, could you help review this PR? Thanks.

dependsOn(tasks.jar)

doFirst {
environment("GRAVITINO_CI_HIVE_DOCKER_IMAGE", "datastrato/gravitino-ci-hive:0.1.10")
A reviewer (Contributor) commented on the diff:

It should be 0.1.11

@FANNG1 (Author) replied:

ok

@FANNG1 (Author):

updated

A Collaborator commented:

I remember we have a 0.1.12 version? @yuqi1129

@FANNG1 (Author):

updated
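The fix discussed in this thread amounts to bumping the image tag in the test task configuration shown above. A sketch, with the 0.1.12 tag taken from the Collaborator's suggestion (the surrounding task wiring is assumed):

```kotlin
// build.gradle.kts (sketch): point the integration tests at the current CI Hive image
tasks.test {
    dependsOn(tasks.jar)
    doFirst {
        // Tag bumped from 0.1.10 per the review thread (0.1.11, then 0.1.12).
        environment("GRAVITINO_CI_HIVE_DOCKER_IMAGE", "datastrato/gravitino-ci-hive:0.1.12")
    }
}
```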

@FANNG1 (Contributor, Author) commented May 22, 2024:

@yuqi1129, do you have any other comments? @jerryshao, do you want to review this PR?

implementation(project(":clients:client-java-runtime", configuration = "shadow"))
implementation(project(":spark-connector:spark-connector"))

implementation("org.apache.iceberg:iceberg-spark-runtime-${sparkMajorVersion}_$scalaVersion:$icebergVersion")
A Collaborator commented on the diff:

Why do we need to package the Iceberg runtime jar into the Spark connector?

@FANNG1 (Author) replied:

We'd better not package the Iceberg runtime jar, but there are some limitations now: we currently register the Iceberg extensions on startup. I prefer to address this in #3396.

The Collaborator replied:

Is it a big piece of work? I think it should be fixed in this version; we cannot ship a Spark connector with the Iceberg runtime packaged in, as it will introduce lots of issues.

@FANNG1 (Author):

It's not a big piece of work, but we have to stay consistent about this:

  1. If we do not register the Iceberg extensions, Iceberg catalogs cannot be dynamically loaded in the future.
  2. When not registering the Iceberg extensions, I prefer to add a configuration option.
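For context, "registering the Iceberg extensions on startup" refers to adding Iceberg's SQL extensions class to `spark.sql.extensions`. A hypothetical sketch of the configuration-gated approach from point 2; the extension class name is Iceberg's real one, but the flag and its wiring are invented for illustration:

```kotlin
// Sketch only: gate Iceberg extension registration behind a connector setting.
// `enableIcebergExtensions` is a hypothetical flag, not an existing config key.
val enableIcebergExtensions = true // would come from connector/user configuration

val extensions = buildList {
    if (enableIcebergExtensions) {
        add("org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    }
}.joinToString(",")

// The resulting value would be passed to Spark as:
//   --conf spark.sql.extensions=<extensions>
```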

The Collaborator:

Please fix this before 0.5.1 is shipped.

@FANNG1 (Author):

ok

@FANNG1 (Contributor, Author) commented May 23, 2024:

@jerryshao @yuqi1129, the comments are addressed; please help review again.

@yuqi1129 (Contributor) commented:

I have no further comments.

@jerryshao jerryshao merged commit cf42811 into datastrato:main May 23, 2024
22 checks passed
github-actions bot pushed a commit that referenced this pull request May 23, 2024
…from integration-test module to spark-connector (#3307)

FANNG1 added a commit to FANNG1/gravitino that referenced this pull request May 23, 2024
…tion test from integration-test module to spark-connector (datastrato#3307)


Successfully merging this pull request may close these issues.

[Improvement] move spark connector integration test from integration-test module to spark-connector module