Hello. I am using sparkdl on a Spark cluster with YARN integrated with Docker. I am having a problem with the user home directory: when the code fetches the pretrained models (such as InceptionV3, Xception, etc.), it stores them in my HOME_DIR. For reasons specific to our setup, YARN does not create the user's HOME_DIR, so the library fails when it tries to write into that directory. What I need is a way to change the default behavior so models are stored in a directory of my choosing.
Would it be possible to change the code so that the cache directory can be defined at execution time? For instance, when I instantiate the following class:
Note: the file with the HOME_DIR hard-coded is src/main/scala/com/databricks/sparkdl/ModelFetcher.scala, on line 40.
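As a sketch of what a fix could look like (this is not sparkdl's actual API; the names `SPARKDL_CACHE_DIR`, `sparkdl.cache.dir`, and `resolveCacheDir` below are hypothetical), the hard-coded path could be replaced by a resolver that checks an environment variable and a JVM system property before falling back to the user's home directory:

```scala
import java.nio.file.{Files, Path, Paths}

// Hypothetical helper: choose the model-cache directory from, in order,
// an environment variable, a JVM system property, or ~/.sparkdl.
object CacheDirResolver {
  def resolveCacheDir(): Path = {
    val dir = sys.env.get("SPARKDL_CACHE_DIR")
      .orElse(sys.props.get("sparkdl.cache.dir"))
      .getOrElse(sys.props("user.home") + "/.sparkdl")
    val path = Paths.get(dir)
    // Create the directory up front so a missing/unwritable location
    // fails fast with a clear error instead of failing mid-download.
    Files.createDirectories(path)
    path
  }
}
```

With something like this in place, the cache location could be set per job, e.g. via `--conf spark.executorEnv.SPARKDL_CACHE_DIR=/tmp/sparkdl-cache` on clusters where HOME_DIR does not exist.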
Best regards!