
Spark-Bench fails on the latest Spark version (2.4.2) #193

Open
sajanraj opened this issue Mar 5, 2020 · 4 comments

Comments

sajanraj commented Mar 5, 2020

Spark-Bench version (version number, tag, or git commit hash)

2.3.0

Details of your cluster setup (Spark version, Standalone/Yarn/Local/Etc)

yarn

Scala version on your cluster

2.12.8

Your exact configuration file (with system details anonymized for security)

spark-bench = {
  spark-submit-config = [{
    workload-suites = [
      {
        descr = "One run of SparkPi and that's it!"
        benchmark-output = "console"
        workloads = [
          {
            name = "sparkpi"
            slices = 10
          }
        ]
      }
    ]
  }]
}

Relevant stacktrace

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
2020-03-05 15:05:59,292 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
	at com.ibm.sparktc.sparkbench.cli.CLIKickoff$.main(CLIKickoff.scala:26)
	at com.ibm.sparktc.sparkbench.cli.CLIKickoff.main(CLIKickoff.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2020-03-05 15:05:59,418 INFO util.ShutdownHookManager: Shutdown hook called
2020-03-05 15:05:59,419 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-dc94b54b-b6ff-46e4-a4b6-dde506128cee
Exception in thread "main" java.lang.Exception: spark-submit failed to complete properly given these arguments: 
	/usr/local/spark/bin/spark-submit
--class
com.ibm.sparktc.sparkbench.cli.CLIKickoff
--master
yarn
/home/hadoop/spark-bench_2.3.0_0.4.0-RELEASE/lib/spark-bench-2.3.0_0.4.0-RELEASE.jar
{"spark-bench":{"spark-submit-config":[{"workload-suites":[{"descr":"One run of SparkPi and that's it!","workloads":[{"name":"sparkpi","slices":10}]}]}]}}
	at com.ibm.sparktc.sparkbench.sparklaunch.submission.sparksubmit.SparkSubmit$.submit(SparkSubmit.scala:51)
	at com.ibm.sparktc.sparkbench.sparklaunch.submission.sparksubmit.SparkSubmit$.launch(SparkSubmit.scala:34)
	at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch$.com$ibm$sparktc$sparkbench$sparklaunch$SparkLaunch$$launch$1(SparkLaunch.scala:58)
	at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch$$anonfun$launchJobs$2.apply(SparkLaunch.scala:65)
	at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch$$anonfun$launchJobs$2.apply(SparkLaunch.scala:65)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch$.launchJobs(SparkLaunch.scala:65)
	at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch$.main(SparkLaunch.scala:38)
	at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch.main(SparkLaunch.scala)

Description of your problem and any other relevant info

Running `./bin/spark-bench.sh ./examples/minimal-example.conf` gives the above error.

@LorenzBuehmann

The issue is on your end: you're using Scala 2.12, but this project is built against Scala 2.11, and the two Scala versions are not binary compatible — which is exactly what the `NoSuchMethodError` on `scala.Predef$.refArrayOps` indicates. (You can check which Scala version a Spark build ships with via `spark-submit --version`.)

@sajanraj (author)

@LorenzBuehmann, is there an updated version available, or could you point to someone who is working on it? Anyway, thanks for the reply.


LorenzBuehmann commented Nov 2, 2020

I don't know — I'm not part of the project team.

But to be fair, you could simply check out the project and make the changes for a newer Spark version as well as Scala 2.12 yourself. It's a simple Scala project with an SBT build file.
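A rough sketch of what those build changes might look like in the project's `build.sbt`. The dependency coordinates and version numbers below are illustrative assumptions, not taken from the actual spark-bench build, which may have a different module layout:

```scala
// build.sbt (sketch): retarget the build to Scala 2.12 and Spark 2.4.x.
// Version numbers are illustrative; align them with your cluster.
scalaVersion := "2.12.8"

val sparkVersion = "2.4.2"

libraryDependencies ++= Seq(
  // "provided": the cluster supplies Spark at runtime via spark-submit
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)
```

After bumping the versions, rebuild the distribution (e.g. with `sbt assembly`, or whatever packaging task the project defines) and resubmit the job.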

@geofflangenderfer

Check out sparkMeasure, which works with Spark 3: https://github.com/LucaCanali/sparkMeasure
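For reference, a minimal sketch of measuring a job with sparkMeasure, based on the usage shown in its README (this assumes `spark` is an active `SparkSession` and the sparkMeasure jar is on the classpath):

```scala
// Sketch: collect stage-level metrics around a Spark action with sparkMeasure.
// Launch e.g. with: spark-shell --packages ch.cern.sparkmeasure:spark-measure_2.12:<version>
import ch.cern.sparkmeasure.StageMetrics

val stageMetrics = StageMetrics(spark)  // spark: an active SparkSession
stageMetrics.runAndMeasure {
  spark.sql("select count(*) from range(1000) cross join range(1000)").show()
}
// Prints aggregated stage metrics (elapsed time, shuffle and CPU counters, ...) to stdout
```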
