
Issue: Gradle run failed. #666

Open
Hritik1100 opened this issue Mar 17, 2022 · 5 comments

@Hritik1100

I am completely new to atlas checks. I ran "gradle run" but it's not working. Can you help me out here?

PS C:\WORK\atlas-checks> gradle run

FAILURE: Build failed with an exception.

  • Where:
    Script 'C:\WORK\atlas-checks\gradle\deployment.gradle' line: 32

  • What went wrong:
    Could not compile script 'C:\WORK\atlas-checks\gradle\deployment.gradle'.

startup failed:
script 'C:\WORK\atlas-checks\gradle\deployment.gradle': 32: unable to resolve class MavenDeployment
@ line 32, column 13.
{
^

@Bentleysb
Collaborator

Hi there,
thanks for checking out the project. There are a couple of things that might help you here. This project relies on having a specific version of gradle. To make it easy to keep track of this the correct version is packaged in the project. Instead of just calling gradle run try using gradlew run or gradlew.bat run from the project root. The other thing is that this project does not always play very nice with windows os. Most windows users have found using docker works best. There are instructions to do that here: https://github.com/osmlab/atlas-checks/blob/dev/docs/docker.md

@Ichchhie

[Screenshot_655 attached]

Hi, @Bentleysb I was calling gradlew run and got the above issue. I am using Windows.

@Bentleysb
Collaborator

Hey @Ichchhie,
It looks like your environment is using an incompatible text encoding. OSM data is all UTF-8 encoded, so that encoding is required to process OSM data through atlas-checks. You could try setting your system's default encoding to UTF-8, or use Docker to run atlas-checks with UTF-8. Otherwise, some additional encoding parameters may need to be added to Gradle's configuration.
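One way to pass those parameters (a sketch, not from this thread; the property names below are standard Gradle/JVM settings) is via a gradle.properties file in the project root:

```properties
# gradle.properties -- force UTF-8 regardless of the Windows default code page.
# org.gradle.jvmargs sets JVM flags for the Gradle daemon;
# systemProp.file.encoding is forwarded to forked JVMs as a system property.
org.gradle.jvmargs=-Dfile.encoding=UTF-8
systemProp.file.encoding=UTF-8
```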

@Ichchhie

Thanks @Bentleysb, adding the encoding settings to gradle.properties worked. However, I am still facing some other issues; the complete process log is below:

Task :compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

Task :run
null
-configFiles=file://F:\kll\atlas-checks/config/configuration.json
-outputFormats=geojson
-cluster=local
-countries=BLZ
-input=file://F:\kll\atlas-checks\build/example/data/atlas/
-savePbfAtlas=false
-sparkOptions=spark.executor.memory->4g,spark.driver.memory->4g,spark.rdd.compress->true
-output=file://F:\kll\atlas-checks\build/example/data/output/
-startedFolder=file://F:\kll\atlas-checks\build/example/tmp/
-sharded=true
-multiAtlas=true
-compressOutput=true \

Task :runChecks FAILED
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/HP/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-classic/1.1.3/d90276fff414f06cb375f2057f6778cd63c6082f/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/HP/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-log4j12/1.7.30/c21f55139d8141d2231214fb1feaf50a1edca95e/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
11:55:28.250 [main] INFO o.o.atlas.utilities.runtime.Command - Parsing switch configFiles -> file://F:\kll\atlas-checks/config/configuration.json
11:55:28.250 [main] INFO o.o.atlas.utilities.runtime.Command - Parsing switch outputFormats -> geojson
11:55:28.250 [main] INFO o.o.atlas.utilities.runtime.Command - Parsing switch cluster -> local
11:55:28.250 [main] INFO o.o.atlas.utilities.runtime.Command - Parsing switch countries -> BLZ
11:55:28.250 [main] INFO o.o.atlas.utilities.runtime.Command - Parsing switch input -> file://F:\kll\atlas-checks\build/example/data/atlas/
11:55:28.250 [main] INFO o.o.atlas.utilities.runtime.Command - Parsing switch savePbfAtlas -> false
11:55:28.250 [main] INFO o.o.atlas.utilities.runtime.Command - Parsing switch sparkOptions -> spark.executor.memory->4g,spark.driver.memory->4g,spark.rdd.compress->true
11:55:28.250 [main] INFO o.o.atlas.utilities.runtime.Command - Parsing switch output -> file://F:\kll\atlas-checks\build/example/data/output/
11:55:28.250 [main] WARN o.o.atlas.utilities.runtime.Command - Unknown switch startedFolder -> file://F:\kll\atlas-checks\build/example/tmp/
11:55:28.250 [main] WARN o.o.atlas.utilities.runtime.Command - Unknown switch sharded -> true
11:55:28.250 [main] INFO o.o.atlas.utilities.runtime.Command - Parsing switch multiAtlas -> true
11:55:28.250 [main] INFO o.o.atlas.utilities.runtime.Command - Parsing switch compressOutput -> true
11:55:28.250 [main] WARN o.o.atlas.utilities.runtime.Command - Running without switch [Switch: name = additionalSparkOptions, description = Comma separated list of additional Spark options, i.e. key1->value1,key2->value2 Default is: ]
11:55:28.265 [main] WARN o.o.atlas.utilities.runtime.Command - Running without switch [Switch: name = sparkContextProvider, description = The class name of the Spark Context Provider Default is: org.openstreetmap.atlas.generator.tools.spark.context.DefaultSparkContextProvider]
11:55:28.265 [main] WARN o.o.atlas.utilities.runtime.Command - Running without switch [Switch: name = inputFolder, description = Path of folder which contains Atlas file(s)]
11:55:28.265 [main] WARN o.o.atlas.utilities.runtime.Command - Running without switch [Switch: name = maproulette, description = Map roulette server information, format :<Port::, projectName is optional.]
11:55:28.265 [main] WARN o.o.atlas.utilities.runtime.Command - Running without switch [Switch: name = configJson, description = Json formatted configuration.]
11:55:28.265 [main] WARN o.o.atlas.utilities.runtime.Command - Running without switch [Switch: name = pbfBoundingBox, description = OSM protobuf data will be loaded only in this bounding box]
11:55:28.265 [main] WARN o.o.atlas.utilities.runtime.Command - Running without switch [Switch: name = checkFilter, description = Comma-separated list of checks to run]
11:55:28.265 [main] WARN o.o.atlas.utilities.runtime.Command - Running without switch [Switch: name = maxPoolMinutes, description = Maximum number of minutes for pool duration.]
11:55:28.265 [main] WARN o.o.atlas.utilities.runtime.Command - Running without switch [Switch: name = externalDataInput, description = Path to the root location that is common to all external data]
11:55:28.265 [main] WARN o.o.atlas.utilities.runtime.Command - Running without switch [Switch: name = shardBufferDistance, description = Distance to expand the bounds of the shard group to create a network in kilometers Default is: 10.0]
11:55:28.265 [main] WARN o.o.atlas.utilities.runtime.Command - Running without switch [Switch: name = sharding, description = Sharding to load in place of sharding file in Atlas path]
11:55:28.800 [main] ERROR o.o.atlas.utilities.runtime.Command - Command execution failed.
java.lang.ExceptionInInitializerError: null
at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54) ~[spark-unsafe_2.12-3.0.1.jar:3.0.1]
at org.apache.spark.internal.config.package$.<init>(package.scala:1006) ~[spark-core_2.12-3.0.1.jar:3.0.1]
at org.apache.spark.internal.config.package$.<clinit>(package.scala) ~[spark-core_2.12-3.0.1.jar:3.0.1]
at org.apache.spark.SparkConf$.<init>(SparkConf.scala:639) ~[spark-core_2.12-3.0.1.jar:3.0.1]
at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala) ~[spark-core_2.12-3.0.1.jar:3.0.1]
at org.apache.spark.SparkConf.set(SparkConf.scala:94) ~[spark-core_2.12-3.0.1.jar:3.0.1]
at org.apache.spark.SparkConf.set(SparkConf.scala:83) ~[spark-core_2.12-3.0.1.jar:3.0.1]
at org.apache.spark.SparkConf.setAppName(SparkConf.scala:120) ~[spark-core_2.12-3.0.1.jar:3.0.1]
at org.openstreetmap.atlas.generator.tools.spark.SparkJob.onRun(SparkJob.java:172) ~[atlas-generator-5.3.6.jar:na]
at org.openstreetmap.atlas.utilities.runtime.Command.execute(Command.java:338) ~[atlas-7.0.6.jar:na]
at org.openstreetmap.atlas.utilities.runtime.Command.run(Command.java:282) ~[atlas-7.0.6.jar:na]
at org.openstreetmap.atlas.checks.distributed.ShardedIntegrityChecksSparkJob.main(ShardedIntegrityChecksSparkJob.java:103) ~[main/:na]
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module @70e9c95d
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:354) ~[na:na]
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297) ~[na:na]
at java.base/java.lang.reflect.Constructor.checkCanSetAccessible(Constructor.java:188) ~[na:na]
at java.base/java.lang.reflect.Constructor.setAccessible(Constructor.java:181) ~[na:na]
at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:56) ~[spark-unsafe_2.12-3.0.1.jar:3.0.1]
... 12 common frames omitted

FAILURE: Build failed with an exception.

  • What went wrong:
    Execution failed for task ':runChecks'.

Process 'command 'C:\Program Files\Java\jdk-17.0.2\bin\java.exe'' finished with non-zero exit value 1

@Bentleysb
Collaborator

@Ichchhie, it looks like your Java version is too new; the failing process in your log is running from jdk-17.0.2. This project requires JDK 11.
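If switching the system-wide JDK is inconvenient, one option (a sketch; org.gradle.java.home is a standard Gradle property, and the install path below is only an example to adjust) is to point Gradle at a JDK 11 install in gradle.properties:

```properties
# gradle.properties -- make Gradle run the build on JDK 11 even if
# JAVA_HOME points at a newer JDK. Adjust the path to your JDK 11 install.
org.gradle.java.home=C:/Program Files/Java/jdk-11.0.14
```

Alternatively, set JAVA_HOME to the JDK 11 directory before invoking the wrapper. The underlying InaccessibleObjectException in the trace comes from Spark reflecting into java.nio, which JDK 16+ strongly encapsulates by default; JDK 11 does not enforce that restriction.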
