Community Note

- Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
- Please do not leave "+1" or other comments that do not add relevant new information or questions; they generate extra noise for issue followers and do not help prioritize the request.
- If you are interested in working on this issue or have submitted a pull request, please leave a comment.
What is the outcome that you are trying to reach?
Currently, each release of Spark Operator is tied to exactly one version of Spark, generally the latest minor/patch release. Our organization runs Spark 3.4.2, released in November 2023. We would like to pick up the most recent fixes and features in Spark Operator without being forced onto a single, specific version of Spark.
Describe the solution you would like
In particular, we would like Spark Operator releases to support multiple versions of Spark. How long each version should be supported is open to debate, but we propose supporting the most recent patch release of each of the three most recent minor versions.
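For context, a SparkApplication can already pin its own Spark image and version per job; the ask here is for the operator itself to tolerate a range of such versions. A minimal sketch of that per-application pinning, assuming the v1beta2 CRD's image and sparkVersion fields; all names, namespaces, and registry paths below are illustrative:

```yaml
# Illustrative SparkApplication pinned to Spark 3.4.2.
# apiVersion/kind and the image/sparkVersion fields come from the
# v1beta2 CRD; the names and registry paths are made up.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: example-app        # hypothetical
  namespace: spark-jobs    # hypothetical
spec:
  type: Scala
  mode: cluster
  image: my-registry/spark:3.4.2   # assumed in-house Spark 3.4.2 image
  sparkVersion: "3.4.2"
  mainClass: org.example.Main      # hypothetical
  mainApplicationFile: local:///opt/app/app.jar
  driver:
    cores: 1
    memory: 1g
    serviceAccount: spark
  executor:
    instances: 2
    cores: 1
    memory: 2g
```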
Describe alternatives you have considered
Alternatives include either forcing a project to align its Spark version with the Spark version baked into spark-operator, or requiring the project to build and maintain its own spark-operator Docker image.
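The second alternative looks roughly like the sketch below: rebuild the operator image against a Spark 3.4.2 base. This assumes the repository's Dockerfile exposes a SPARK_IMAGE build argument (it has in past releases, but verify against the Dockerfile in the tag you build) and that a spark:3.4.2 base image is available; the registry path and tag are illustrative:

```sh
# Rebuild the operator image on a Spark 3.4.2 base and push it to a
# private registry. SPARK_IMAGE is assumed to be the Dockerfile's build
# argument for the base image; the target tag is made up.
git clone https://github.com/kubeflow/spark-operator.git
cd spark-operator
docker build \
  --build-arg SPARK_IMAGE=spark:3.4.2 \
  -t my-registry/spark-operator:custom-spark-3.4.2 .
docker push my-registry/spark-operator:custom-spark-3.4.2
```

Maintaining such an image means re-running this build for every operator release you adopt, which is the burden this request aims to remove.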
Additional context