
Spark 3.1.2 support does not work with protobuf (sparksql-scalapb) #35

Closed
RayRoestenburg opened this issue Aug 9, 2021 · 10 comments

@RayRoestenburg (Contributor) commented Aug 9, 2021

While trying out a Spark project with protobuf, the following error occurs when the executor runs:

Caused by: java.lang.NoSuchMethodError: com.google.protobuf.CodedInputStream.readStringRequireUtf8()Ljava/lang/String;
	at sensors.proto.Data$.parseFrom(Data.scala:140)
	at sensors.proto.Data$.parseFrom(Data.scala:126)
	at scalapb.GeneratedMessageCompanion.parseFrom(GeneratedMessageCompanion.scala:185)
	at scalapb.GeneratedMessageCompanion.parseFrom$(GeneratedMessageCompanion.scala:185)
	at sensors.proto.Data$.parseFrom(Data.scala:126)
	at cloudflow.streamlets.proto.ProtoCodec.$anonfun$decode$1(ProtoCodec.scala:27)

This probably happens because there are multiple protobuf libraries on the classpath. Looking at sparksql-scalapb, Hadoop contains an outdated protobuf library that needs to be shaded:
https://scalapb.github.io/docs/sparksql/#setting-up-your-project
https://github.com/thesamet/sparksql-scalapb-test/blob/master/build.sbt
But Cloudflow does not use sbt-assembly, so we need to find another way to ensure that the right protobuf library is used.
(It could also have another cause, but we should be able to tell from the dependencies.)
We should probably try https://github.com/coursier/sbt-shading to shade the dependencies.
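
For reference, the setup described in those docs relies on sbt-assembly shade rules along these lines (a sketch of the rule from the linked page; it doesn't apply to Cloudflow as-is, precisely because we don't use sbt-assembly):

    // build.sbt with sbt-assembly: relocate protobuf 3.x so it cannot
    // clash with the outdated protobuf that Hadoop puts on the classpath
    assembly / assemblyShadeRules := Seq(
      ShadeRule.rename("com.google.protobuf.**" -> "shadeproto.@1").inAll
    )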

@RayRoestenburg changed the title from "Spark + protobuf" to "Spark 3.1.2 support does not work with protobuf (sparksql-scalapb)" on Aug 9, 2021
@debasishg (Contributor)

Does this example work with Cloudflow 2.2.0 + Spark 3.1.2 + Avro?

@debasishg (Contributor) commented Aug 10, 2021

Shading should fix it: https://stackoverflow.com/a/41609653/16405999. Since we don't use sbt-assembly, we can use https://github.com/coursier/sbt-shading. But in that case, do we need to use the sbt runner from https://github.com/coursier/sbt-shading/blob/master/sbt?
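
For the record, a rough sketch of what the sbt-shading route might look like, going by that plugin's README (the module to shade and the target namespace here are assumptions, untested with our build):

    // build.sbt with coursier's sbt-shading plugin (sketch, unverified)
    enablePlugins(ShadingPlugin)
    // shade the protobuf-java dependency pulled in transitively
    shadedModules += "com.google.protobuf" % "protobuf-java"
    // relocate its classes under a Cloudflow-private namespace
    shadingRules += ShadingRule.moveUnder("com.google.protobuf", "cloudflow.shaded")
    validNamespaces += "cloudflow"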

@debasishg (Contributor) commented Aug 21, 2021

Tried to shade the module as per https://scalapb.github.io/docs/sparksql/, but ran into the following in the logs for spark-sensors-proto-process-6ca7837b719373ca-driver:

"Exception in thread "main" java.lang.AbstractMethodError: Method shadeproto/descriptor/FileDescriptorProto$.parseFrom(Lcom/google/protobuf/CodedInputStream;)Lscalapb/GeneratedMessage; is abstract
	at shadeproto.descriptor.FileDescriptorProto$.parseFrom(FileDescriptorProto.scala)
	at scalapb.GeneratedMessageCompanion.parseFrom(GeneratedMessageCompanion.scala:185)
	at scalapb.GeneratedMessageCompanion.parseFrom$(GeneratedMessageCompanion.scala:185)
	at shadeproto.descriptor.FileDescriptorProto$.parseFrom(FileDescriptorProto.scala:258)
	at shadeproto.wrappers.WrappersProto$.scalaDescriptor$lzycompute(WrappersProto.scala:35)
	at shadeproto.wrappers.WrappersProto$.scalaDescriptor(WrappersProto.scala:34)
	at shadeproto.wrappers.DoubleValue$.scalaDescriptor(DoubleValue.scala:114)
	at scalapb.spark.SchemaOptions$.<init>(SchemaOptions.scala:51)
	at scalapb.spark.SchemaOptions$.<clinit>(SchemaOptions.scala)
	at scalapb.spark.ProtoSQL$.<init>(ProtoSQL.scala:198)
	at scalapb.spark.ProtoSQL$.<clinit>(ProtoSQL.scala)
	at scalapb.spark.Implicits$$anon$1.protoSql$lzycompute(TypedEncoders.scala:129)
	at scalapb.spark.Implicits$$anon$1.protoSql(TypedEncoders.scala:129)
	at scalapb.spark.Implicits$$anon$1.protoSql(TypedEncoders.scala:127)
	at scalapb.spark.ToCatalystHelpers.messageToCatalyst(ToCatalystHelpers.scala:27)
	at scalapb.spark.ToCatalystHelpers.messageToCatalyst$(ToCatalystHelpers.scala:23)
	at scalapb.spark.Implicits$$anon$1.messageToCatalyst(TypedEncoders.scala:127)
	at scalapb.spark.TypedEncoders$MessageTypedEncoder.toCatalyst(TypedEncoders.scala:48)
	at frameless.TypedExpressionEncoder$.apply(TypedExpressionEncoder.scala:28)
	at scalapb.spark.Implicits.typedEncoderToEncoder(TypedEncoders.scala:123)
	at scalapb.spark.Implicits.typedEncoderToEncoder$(TypedEncoders.scala:120)
	at scalapb.spark.Implicits$.typedEncoderToEncoder(TypedEncoders.scala:126)
	at sensors.proto.MovingAverageSparklet$$anon$1.buildStreamingQueries(MovingAverageSparklet.scala:38)
	at cloudflow.spark.SparkStreamlet.run(SparkStreamlet.scala:91)
	at cloudflow.spark.SparkStreamlet.run$(SparkStreamlet.scala:82)
	at sensors.proto.MovingAverageSparklet.run(MovingAverageSparklet.scala:30)
	at sensors.proto.MovingAverageSparklet.run(MovingAverageSparklet.scala:30)
	at cloudflow.streamlets.Streamlet.run(Streamlet.scala:107)
	at cloudflow.runner.Runner$.run(Runner.scala:67)
	at cloudflow.runner.Runner$.main(Runner.scala:45)
	at cloudflow.runner.Runner.main(Runner.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Code in branch

Looks somewhat similar to scalapb/ScalaPB#1122.

@andreaTP (Contributor)

@debasishg, after further investigation, the result is:

  • we can't use protobuf-java 2.5.0, since ScalaPB doesn't support it and we are using the relevant code paths
  • shading really is the only option

Taking a look at what you did, the missing steps to hack something together are probably:

  • remove all the library jars explicitly, since they are going to be included in the uber-jar, e.g.:

    extraDockerInstructions := Seq(
      Instructions.Run.shell(Seq("rm", "/opt/cloudflow/*.jar"))
    ),

  • add back the generated uber-jar in /opt/cloudflow and rename it to /opt/cloudflow/cloudflow-runner.jar (sketched below)
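
A sketch of those two steps together (Run.shell as in the snippet above; the staging path of the uber-jar is hypothetical and would depend on an earlier step that copies it into the image):

    extraDockerInstructions := Seq(
      // drop the individually staged dependency jars; they now live inside the uber-jar
      Instructions.Run.shell(Seq("rm", "/opt/cloudflow/*.jar")),
      // put the shaded uber-jar back under the name the runner expects,
      // assuming an earlier instruction staged it at /tmp/app-assembly.jar
      Instructions.Run.shell(Seq("mv", "/tmp/app-assembly.jar", "/opt/cloudflow/cloudflow-runner.jar"))
    )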

If the resulting Docker image works, we can clean up the process.

@debasishg (Contributor)

@andreaTP Spark also cannot work with protobuf-java 3.5.x, hence I guess the only option is to:

  • shade protobuf-java within scalapb so that it uses the shaded version
  • let Spark use the 2.5.0 version (it uses APIs which are incompatible with 3.5.x)
  • not sure what version akka-grpc needs though - will have to check

Hence I don't think we can delete all versions other than 3.5.x.

@andreaTP (Contributor)

@debasishg all good, but:

Hence I don't think we can delete all versions other than 3.5.x.

We strictly have to remove all versions but 2.5.0, which is the one used by Spark itself; 3.x will be shaded into the resulting application artifact and be used by akka-grpc, ScalaPB, and Cloudflow.

@debasishg (Contributor)

@andreaTP OK, so I shaded akka-protobuf as well, so that it also uses 3.15.8, which it requires. The problem is that if I keep only 2.5.0 unshaded, the protobuf compiler complains of a missing class:

Compiling 2 protobuf files to /Users/debasishghosh/lightbend/cloudflow-contrib/examples/spark-sensors-proto/target/scala-2.12/akka-grpc/main
[error] java.lang.NoClassDefFoundError: com/google/protobuf/compiler/PluginProtos$CodeGeneratorRequest

@andreaTP (Contributor)

The problem is that if I keep only 2.5.0 unshaded, the protobuf compiler complains of a missing class:

You probably need an extra step: move the compilation of the protobuf files to a separate sub-project, have the main project depend on that artifact, and shade the proto dependency there.
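
A minimal sketch of that layout, assuming sbt-protoc/ScalaPB for the code generation and sbt-assembly for the shading (the project names are made up):

    // protos/ compiles the .proto files against the unshaded protobuf-java 3.x
    lazy val protos = (project in file("protos"))
      .settings(
        Compile / PB.targets := Seq(
          scalapb.gen() -> (Compile / sourceManaged).value / "scalapb"
        )
      )

    // the application depends on that artifact and shades protobuf there
    lazy val app = (project in file("app"))
      .dependsOn(protos)
      .settings(
        assembly / assemblyShadeRules := Seq(
          ShadeRule.rename("com.google.protobuf.**" -> "shadeproto.@1").inAll
        )
      )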

@debasishg (Contributor) commented Aug 25, 2021

Here's the latest status on this:

After exploring lots of options, I figured out that the only way to achieve the goal is the following:

  • add shade rules for protobuf-java (version 3.15.8) coming from scalapb
  • add shade rules for protobuf-java (version 3.15.8) coming from akka-protobuf
  • add shade rules for any other source of protobuf-java, for any version other than 2.5.0
  • prepare an assembly of all jars except Spark, so that all instances of protobuf-java 3.15.8 are shaded and their transitive dependencies are changed accordingly; this results in an uber-jar plus the Spark jars
  • use dependencyOverrides to pin protobuf-java to version 2.5.0, the one that Spark uses (sketched below)
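
A sketch of what the last three steps could look like in build.sbt (assuming sbt-assembly; filtering jars by the spark- name prefix is an approximation of "all jars except Spark"):

    // shade every protobuf 3.x class that ends up in the uber-jar
    assembly / assemblyShadeRules := Seq(
      ShadeRule.rename("com.google.protobuf.**" -> "shadeproto.@1").inAll
    )
    // keep the Spark jars out of the assembly
    assembly / assemblyExcludedJars := (assembly / fullClasspath).value
      .filter(_.data.getName.startsWith("spark-"))
    // pin the unshaded protobuf-java on the classpath to the version Spark uses
    dependencyOverrides += "com.google.protobuf" % "protobuf-java" % "2.5.0"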

This implies a redesign of how we package the image in Cloudflow today for the Spark plugin, which I think is a significant effort. Should we invest time and effort in this? @RayRoestenburg @andreaTP

@RayRoestenburg (Contributor, Author)

This should be retried with 0.2.0, which upgrades to Spark 3.2.0. Closing this ticket for that reason.
