
[Bug]: New Docker-Image breaks jaeger-operator spark-dependencies-jobs #138

Open

rriverak opened this issue Mar 27, 2024 · 3 comments

@rriverak

What happened?

Since the last update (20 days ago) of the Docker image, which overwrote the latest tag, the spark job run by the jaeger-operator has stopped working.

see operator issue here: jaegertracing/jaeger-operator#2508

Steps to reproduce

Use the latest version of the jaeger-operator (1.49.0) and the spark dependencies jobs will fail.
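
A minimal sketch of a Jaeger CR that triggers the dependencies CronJob (field names as I understand the Jaeger Operator CRD; the Cassandra host and keyspace are placeholders):

apiVersion: jaegertracing.io/v1
kind: Jaeger
metadata:
  name: my-jaeger
spec:
  strategy: production
  storage:
    type: cassandra
    options:
      cassandra:
        servers: cassandra.default.svc    # placeholder Cassandra host
        keyspace: jaeger_v1_dc1           # placeholder keyspace
    dependencies:
      enabled: true
      # image is not set; with no versioned tags published, the operator
      # effectively pulls the :latest spark-dependencies image - the tag
      # that was overwritten by the new build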

Expected behavior

At the very least, the old Docker image should remain available. Overwriting the latest tag leaves no workaround.

Relevant log output

jaeger-operator-jaeger-spark-dependencies log
---
WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/app/jaeger-spark-dependencies-0.0.1-SNAPSHOT.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" java.io.IOException: Failed to open native connection to Cassandra at {10.240.20.201}:9042
        at com.datastax.spark.connector.cql.CassandraConnector$.createSession(CassandraConnector.scala:168)
        at com.datastax.spark.connector.cql.CassandraConnector$.$anonfun$sessionCache$1(CassandraConnector.scala:154)
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32)
        at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69)
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57)
        at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79)
        at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111)
        at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:122)
        at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:332)
        at com.datastax.spark.connector.cql.Schema$.tableFromCassandra(Schema.scala:352)
        at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider.tableDef(CassandraTableRowReaderProvider.scala:50)
        at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider.tableDef$(CassandraTableRowReaderProvider.scala:50)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef$lzycompute(CassandraTableScanRDD.scala:63)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef(CassandraTableScanRDD.scala:63)
        at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider.verify(CassandraTableRowReaderProvider.scala:137)
        at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider.verify$(CassandraTableRowReaderProvider.scala:136)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD.verify(CassandraTableScanRDD.scala:63)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:263)
        at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:294)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:290)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:49)
        at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:294)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:290)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:49)
        at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:294)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:290)
        at org.apache.spark.Partitioner$.$anonfun$defaultPartitioner$4(Partitioner.scala:78)
        at org.apache.spark.Partitioner$.$anonfun$defaultPartitioner$4$adapted(Partitioner.scala:78)
        at scala.collection.immutable.List.map(List.scala:293)
        at org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:78)
        at org.apache.spark.rdd.PairRDDFunctions.$anonfun$groupByKey$6(PairRDDFunctions.scala:636)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:410)
        at org.apache.spark.rdd.PairRDDFunctions.groupByKey(PairRDDFunctions.scala:636)
        at org.apache.spark.api.java.JavaPairRDD.groupByKey(JavaPairRDD.scala:561)
        at io.jaegertracing.spark.dependencies.cassandra.CassandraDependenciesJob.run(CassandraDependenciesJob.java:169)
        at io.jaegertracing.spark.dependencies.DependenciesSparkJob.run(DependenciesSparkJob.java:60)
        at io.jaegertracing.spark.dependencies.DependenciesSparkJob.main(DependenciesSparkJob.java:40)
Caused by: java.lang.NoClassDefFoundError: com/codahale/metrics/JmxReporter
        at com.datastax.driver.core.Metrics.<init>(Metrics.java:146)
        at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1501)
        at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:451)
        at com.datastax.spark.connector.cql.CassandraConnector$.createSession(CassandraConnector.scala:161)
        ... 41 more
Caused by: java.lang.ClassNotFoundException: com.codahale.metrics.JmxReporter
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(Unknown Source)
        at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(Unknown Source)
        at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
        ... 45 more

Screenshot

No response

Additional context

No response

Jaeger backend version

No response

SDK

No response

Pipeline

No response

Storage backend

No response

Operating system

No response

Deployment model

Kubernetes

Deployment configs

No response

@rriverak rriverak added the bug label Mar 27, 2024
@yurishkuro
Member

All previous images are still available: https://github.com/jaegertracing/spark-dependencies/pkgs/container/spark-dependencies%2Fspark-dependencies/versions

Running latest in production is not recommended.

@yurishkuro
Member

Sadly, the ci.yaml does not include any tests against Cassandra, only Elasticsearch. It also doesn't log anything about containers being executed - would be nice to include docker logs as an after-step.
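
A sketch of such an after-step (the surrounding job and step names in ci.yaml will differ; this only shows the idea):

      - name: Dump container logs
        if: always()          # run even when the test step fails
        run: |
          docker ps -a
          for c in $(docker ps -aq); do
            echo "===== logs for container $c ====="
            docker logs "$c" || true
          done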

@rriverak
Author

rriverak commented Apr 2, 2024

All previous images are still available: https://github.com/jaegertracing/spark-dependencies/pkgs/container/spark-dependencies%2Fspark-dependencies/versions

Running latest in production is not recommended.

The lack of other tags makes it difficult to use anything other than latest in production (which is simply the default).
We have now pinned the SHA digest of the old image, which is also not recommended for production.

Versioning according to SemVer would be helpful to allow image pinning by the Jaeger Operator.
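
As a stop-gap, the job image can be pinned in the Jaeger CR so the operator stops tracking latest. A sketch, assuming the spec.storage.dependencies.image field of the Jaeger CRD; the digest is a placeholder to be replaced with the last known-good image:

spec:
  storage:
    dependencies:
      enabled: true
      # placeholder digest - substitute the digest of the last working image
      image: ghcr.io/jaegertracing/spark-dependencies/spark-dependencies@sha256:<known-good-digest>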
