When we use the local Maven repo as the first-level cache for sbt and there is a corrupted jar in that repo, the build fails outright.
This problem has been bothering us for a long time; a similar situation has occurred since sbt 1.9.4.
Also, dependency resolution is delegated to Coursier, so if you would like any action taken, could you please report this issue to coursier/coursier as well?
All of the approaches above can make GitHub Actions (GA) pass, but none of them is perfect.
In particular, some of the affected dependencies do not appear to be introduced by us in Spark, but are dependencies of the tools themselves, e.g.:
This problem has been bothering us for a long time, ever since sbt 1.9.4.
Besides the methods we tried above, is there a better way to work around it?
Thank you very much!
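Until sbt/Coursier skips bad cache entries automatically, one workaround (a sketch, not an official sbt or Coursier tool; `find_corrupted_jars` is a name chosen here) is to scan the local Maven repo before the build for jars that fail a zip integrity check, then delete them so the next resolution re-fetches them:

```python
# Sketch: find jars in a local Maven repository that are not valid zip
# archives, so they can be removed and re-downloaded instead of failing
# the build.
import zipfile
from pathlib import Path

def find_corrupted_jars(repo_root):
    """Return every *.jar under repo_root that fails a zip integrity check."""
    corrupted = []
    for jar in Path(repo_root).rglob("*.jar"):
        try:
            with zipfile.ZipFile(jar) as zf:
                # testzip() returns the first entry with a bad CRC, or None
                if zf.testzip() is not None:
                    corrupted.append(jar)
        except zipfile.BadZipFile:
            corrupted.append(jar)
    return corrupted
```

Running this against `~/.m2/repository` and deleting the reported files would let Coursier re-download them on the next build; this treats the symptom, not the underlying caching behavior.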
steps
When we use the local Maven repo as the first-level cache of sbt, and there is a corrupted jar in this repo, the build fails directly. This problem has been bothering us for a long time; a similar situation has occurred since sbt 1.9.4.
problem
https://github.com/panbingkun/spark/actions/runs/8105142421/job/22153031672
Starting from sbt version 1.9.3, when there is a corrupted jar in Maven's local repo, the build fails.
expectation
When there is a corrupted jar in Maven's local repo, the build should not fail; instead, sbt should automatically skip this cache entry.
notes