Spark-Bench version (version number, tag, or git commit hash)
2.3.0
Details of your cluster setup (Spark version, Standalone/Yarn/Local/Etc)
yarn
Scala version on your cluster
2.12.8
Your exact configuration file (with system details anonymized for security)
spark-bench = {
  spark-submit-config = [{
    workload-suites = [
      {
        descr = "One run of SparkPi and that's it!"
        benchmark-output = "console"
        workloads = [
          {
            name = "sparkpi"
            slices = 10
          }
        ]
      }
    ]
  }]
}
Relevant stacktrace
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
2020-03-05 15:05:59,292 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at com.ibm.sparktc.sparkbench.cli.CLIKickoff$.main(CLIKickoff.scala:26)
at com.ibm.sparktc.sparkbench.cli.CLIKickoff.main(CLIKickoff.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2020-03-05 15:05:59,418 INFO util.ShutdownHookManager: Shutdown hook called
2020-03-05 15:05:59,419 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-dc94b54b-b6ff-46e4-a4b6-dde506128cee
Exception in thread "main" java.lang.Exception: spark-submit failed to complete properly given these arguments:
/usr/local/spark/bin/spark-submit
--class
com.ibm.sparktc.sparkbench.cli.CLIKickoff
--master
yarn
/home/hadoop/spark-bench_2.3.0_0.4.0-RELEASE/lib/spark-bench-2.3.0_0.4.0-RELEASE.jar
{"spark-bench":{"spark-submit-config":[{"workload-suites":[{"descr":"One run of SparkPi and that's it!","workloads":[{"name":"sparkpi","slices":10}]}]}]}}
at com.ibm.sparktc.sparkbench.sparklaunch.submission.sparksubmit.SparkSubmit$.submit(SparkSubmit.scala:51)
at com.ibm.sparktc.sparkbench.sparklaunch.submission.sparksubmit.SparkSubmit$.launch(SparkSubmit.scala:34)
at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch$.com$ibm$sparktc$sparkbench$sparklaunch$SparkLaunch$$launch$1(SparkLaunch.scala:58)
at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch$$anonfun$launchJobs$2.apply(SparkLaunch.scala:65)
at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch$$anonfun$launchJobs$2.apply(SparkLaunch.scala:65)
at scala.collection.immutable.List.foreach(List.scala:381)
at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch$.launchJobs(SparkLaunch.scala:65)
at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch$.main(SparkLaunch.scala:38)
at com.ibm.sparktc.sparkbench.sparklaunch.SparkLaunch.main(SparkLaunch.scala)
Description of your problem and any other relevant info
But to be fair, you could simply check out the project and make the changes for a newer Spark version as well as Scala 2.12 yourself. It's a simple Scala project with an SBT build file.
Running /bin/spark-bench.sh ./examples/minimal-example.conf gives the above error.