This repository has been archived by the owner on May 31, 2024. It is now read-only.
Hi,
I found the Data Mechanics Delight tool online and started using it locally on my Mac. It seems amazing!

However, when trying to run it from our cluster, it failed with the message "Could not process the Spark Events log. Metrics are unavailable."

We're running Spark 2.4.7 on AWS in standalone mode. The history server is enabled, and the spark.eventLog.enabled property is set. The job does manage to send heartbeats to Delight, and there are no errors in the log (all log lines are INFO). At the end of the log, once the job has finished successfully, I see this message:

[spark-listener-group-shared] INFO co.datamechanics.delight.DelightStreamingConnector - Application will be available in a few minutes on Delight at this url: https://delight.datamechanics.co/apps/some-id

When opening the link, I can see the Spark UI of the history server from the overview screen, but the Delight overview itself is empty (see image below).

This is the spark-submit command I'm using (there are some additional configs on the machine itself):
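(The exact command is environment-specific, so I'm only sketching it here. A Delight-enabled spark-submit in standalone mode, roughly following the Delight README, looks something like the block below; the master URL, access token, Scala version suffix, and application jar are placeholders, not our actual values.)

```shell
# Rough sketch of a Delight-enabled spark-submit (standalone mode).
# Placeholders: <master-host>, <your-delight-token>, my-application.jar.
# Package coordinates and the listener class follow the Delight README;
# the _2.11 suffix assumes a Scala 2.11 build of Spark 2.4.x.
spark-submit \
  --master spark://<master-host>:7077 \
  --packages co.datamechanics:delight_2.11:latest-SNAPSHOT \
  --repositories https://oss.sonatype.org/content/repositories/snapshots \
  --conf spark.delight.accessToken.secret=<your-delight-token> \
  --conf spark.extraListeners=co.datamechanics.delight.DelightListener \
  --conf spark.eventLog.enabled=true \
  my-application.jar
```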
How can I solve this issue so that we can use Delight from our cluster?
Thanks,
Lior