Example ENABLE_DAEMON_INIT submit step #114

Open
noahkawasakigoogle opened this issue Jan 10, 2021 · 0 comments
Hello, this is a feature request: for those of us who aren't experts in how Spark works, it would help to add an example of an additional step to run with the ENABLE_DAEMON_INIT flag.

For myself in particular, I'm hoping this could be used to start the HiveThrift server when the containers come up, so I can use it for JDBC testing in CI from outside the container, along with a simple ETL script that creates some tables and loads a small dataset.

I have found that this command lets me start up the thrift server:

```
cd /spark/bin && /spark/sbin/../bin/spark-class org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal
```
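As a side note, Spark also ships a helper script, `sbin/start-thriftserver.sh`, which wraps this same spark-class invocation. A minimal sketch of using it, assuming the `/spark` install prefix from the command above; guarding on the script's presence keeps the step a no-op on images that lay things out differently:

```shell
# Hypothetical sketch: prefer Spark's bundled helper when present.
# The /spark prefix is an assumption taken from the command above.
if [ -x /spark/sbin/start-thriftserver.sh ]; then
  # start-thriftserver.sh launches HiveThriftServer2 as a background daemon
  /spark/sbin/start-thriftserver.sh
else
  echo "start-thriftserver.sh not found; falling back to spark-class" >&2
fi
```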

So I will probably docker exec it before starting the test runner. But it would be cool if this could be included in the container's initialization instead.
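If the image's entrypoint honors ENABLE_DAEMON_INIT by executing extra init steps, such a step might look like the following sketch. The "true"/"false" convention for the flag and the wiring into the entrypoint are assumptions; only the spark-class command itself comes from above:

```shell
#!/bin/sh
# Hypothetical ENABLE_DAEMON_INIT step: start HiveThriftServer2 in the
# background at container start so JDBC tests can connect from outside.
# The "true"/"false" convention for the flag is an assumption.
if [ "${ENABLE_DAEMON_INIT:-false}" = "true" ]; then
  cd /spark/bin && ./spark-class org.apache.spark.deploy.SparkSubmit \
    --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 \
    spark-internal &
fi
```

Until something like this exists, docker exec-ing the same command into the running container before the test runner achieves the same effect.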

Thanks
