
support Spark3.2 and Java11 #4

Open
a140262 opened this issue Sep 26, 2022 · 3 comments


a140262 commented Sep 26, 2022

What would you like to be added?

I am trying to build a CSS Docker image that supports Spark 3.2 on Java 11 (FROM amazoncorretto:11), but I have not been able to compile it so far. How can I make it work with Spark 3.2? It would also be acceptable if CSS works with Spark 3.2 and Java 8. Please help.

Here is the list of version changes in the pom.xml file:

<java.version>11</java.version>
<maven.compiler.source>${java.version}</maven.compiler.source>
<maven.compiler.target>${java.version}</maven.compiler.target>
<maven.version>3.8.6</maven.version>
<spark.version>3.2.0</spark.version>
<hadoop.version>3.2.1</hadoop.version>
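
These can also be overridden on the mvn command line instead of editing pom.xml (assuming the build reads them as ordinary Maven properties and mvn is invoked directly rather than through build.sh), for example:

mvn clean package -DskipTests -Djava.version=11 -Dspark.version=3.2.0 -Dhadoop.version=3.2.1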

Error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project css-common_2.12: Compilation failure: Compilation failure: 
[ERROR] /tmp/CloudShuffleService/common/src/main/java/com/bytedance/css/common/unsafe/Platform.java:[27,16] cannot find symbol
[ERROR]   symbol:   class Cleaner
[ERROR]   location: package sun.misc
[ERROR] /tmp/CloudShuffleService/common/src/main/java/com/bytedance/css/common/unsafe/Platform.java:[174,7] cannot find symbol
[ERROR]   symbol:   class Cleaner
[ERROR]   location: class com.bytedance.css.common.unsafe.Platform
[ERROR] /tmp/CloudShuffleService/common/src/main/java/com/bytedance/css/common/unsafe/Platform.java:[174,25] cannot find symbol
[ERROR]   symbol:   variable Cleaner
[ERROR]   location: class com.bytedance.css.common.unsafe.Platform
[ERROR] -> [Help 1]
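
For context, sun.misc.Cleaner no longer exists on Java 9+ (the JDK moved it to jdk.internal.ref.Cleaner), so Platform.java cannot reference it directly when compiled with Java 11. One possible shape of a fix is to look the class up by name at runtime instead of importing it; the sketch below is only illustrative (CleanerCompat is a made-up name, not part of CSS):

import java.lang.reflect.Method;

// Hypothetical helper, not part of CSS: resolves whichever Cleaner class the
// running JDK provides, so the same source compiles on Java 8 and Java 11.
public final class CleanerCompat {

  // sun.misc.Cleaner exists up to Java 8; Java 9+ moved it to jdk.internal.ref.Cleaner.
  private static final Method CLEANER_CREATE = findCreateMethod();

  private static Method findCreateMethod() {
    for (String className : new String[] {"sun.misc.Cleaner", "jdk.internal.ref.Cleaner"}) {
      try {
        // Both variants expose a static create(Object referent, Runnable thunk) factory.
        return Class.forName(className).getMethod("create", Object.class, Runnable.class);
      } catch (ReflectiveOperationException ignored) {
        // Class not present on this JDK; try the next candidate.
      }
    }
    return null;
  }

  /** Registers a cleanup action for the referent, or no-ops if no Cleaner is reachable. */
  public static void register(Object referent, Runnable cleanup) {
    if (CLEANER_CREATE == null) {
      return;
    }
    try {
      CLEANER_CREATE.invoke(null, referent, cleanup);
    } catch (ReflectiveOperationException e) {
      throw new IllegalStateException("Failed to register Cleaner for " + referent, e);
    }
  }

  private CleanerCompat() {}
}

On Java 9+ the reflective call may additionally need --add-exports java.base/jdk.internal.ref=ALL-UNNAMED (or --add-opens) on the JVM command line at runtime.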

The steps to reproduce are:

1. Spin up a JDK 11 container with Maven installed:
docker build -t jdk11 .
docker run -it jdk11

The jdk11 Dockerfile looks like this:

FROM amazoncorretto:11
ARG MAVEN_VERSION=3.8.6
ARG BASE_URL=https://apache.osuosl.org/maven/maven-3/${MAVEN_VERSION}/binaries

# install maven
RUN yum update -y && yum install -y git tar
RUN mkdir -p /usr/share/maven /usr/share/maven/ref \
 && curl -fsSL -o /tmp/apache-maven.tar.gz ${BASE_URL}/apache-maven-${MAVEN_VERSION}-bin.tar.gz \
 && tar -xzf /tmp/apache-maven.tar.gz -C /usr/share/maven --strip-components=1 \
 && rm -f /tmp/apache-maven.tar.gz \
 && ln -s /usr/share/maven/bin/mvn /usr/bin/mvn
# $USER_HOME_DIR is root's home in this image; referenced by MAVEN_CONFIG below
ARG USER_HOME_DIR="/root"
ENV MAVEN_HOME /usr/share/maven
ENV MAVEN_CONFIG "$USER_HOME_DIR/.m2"

2. After logging in to the jdk11 container, upgrade the Spark and Java versions in pom.xml, then compile the project (a scripted alternative to the vi edit is sketched after these commands):
git clone https://github.com/bytedance/CloudShuffleService.git /tmp/CloudShuffleService
cd /tmp/CloudShuffleService
vi pom.xml   # apply the version changes listed above
./build.sh
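
The vi step can also be scripted for a non-interactive reproduction; this sed sketch simply rewrites whatever versions the checked-out pom.xml currently declares (property names taken from the list above):

sed -i \
  -e 's|<java.version>[^<]*</java.version>|<java.version>11</java.version>|' \
  -e 's|<spark.version>[^<]*</spark.version>|<spark.version>3.2.0</spark.version>|' \
  -e 's|<hadoop.version>[^<]*</hadoop.version>|<hadoop.version>3.2.1</hadoop.version>|' \
  pom.xml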

Why is this needed?

We need to run the latest Spark with CSS.

bdyx123 (Collaborator) commented Sep 28, 2022

We currently support Spark 3.0, and we are working on support for Spark 3.2.

bdyx123 (Collaborator) commented Oct 9, 2022

We have tested it; spark-shuffle-manager-3 is also suitable for Spark 3.2. @a140262


melodyyangaws commented Oct 25, 2022

How do you compile it for Spark 3.2? These settings didn't work for me when compiling CSS:

<java.version>11</java.version>
<spark.version>3.2.0</spark.version>
<hadoop.version>3.2.1</hadoop.version> 

Please provide more details on your test.
