
Support to turn off auto refresh #7

Closed
barunpuri opened this issue Feb 1, 2024 · 4 comments

@barunpuri

Describe the bug
It's not a bug, but an improvement request.

Environment
Spark version: 3.2
Platform: standalone

To Reproduce
Steps to reproduce the behavior:

  1. Submit spark query
  2. Open spark history page
  3. Open Dataflint
  4. It refreshes the page automatically

Expected behavior
The ability to turn off auto-refresh.

Screenshots

Screen.Recording.2024-02-01.at.5.46.42.PM.mov

As you can see, the data keeps updating.


@menishmueli
Contributor

menishmueli commented Feb 1, 2024

Hi @barunpuri!

The source of the issue is not "auto refresh": when DataFlint runs in history server mode, it doesn't refresh (there is no reason to).
There is a server error, and the error message is not indicative (it's the same error shown when DataFlint runs on the Spark driver and the server stops responding, for example because the Spark session ended).

Can you please open your browser's inspect window and share the output of the console and network tabs? And for any failed network request, the error it returned?

Thanks!

@barunpuri
Author

barunpuri commented Feb 1, 2024

I see, thank you for the response.
The request to the history server failed.
Request
GET http://___history_server___/proxy/___application_id____/api/v1/applications/___application_id____/sql?offset=0&length=1000&planDescription=false
Response

<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 500 Request failed.</title>
</head>
<body><h2>HTTP ERROR 500 Request failed.</h2>
<table>
<tr><th>URI:</th><td>/api/v1/applications/___application_id____/sql</td></tr>
<tr><th>STATUS:</th><td>500</td></tr>
<tr><th>MESSAGE:</th><td>Request failed.</td></tr>
<tr><th>SERVLET:</th><td>org.glassfish.jersey.servlet.ServletContainer-49154699</td></tr>
</table>
<hr><a href="https://eclipse.org/jetty">Powered by Jetty:// 9.4.43.v20210629</a><hr/>

</body>
</html>
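
For reference, the same request can be reproduced outside DataFlint with curl (a minimal sketch; the host and application id placeholders below are the same as in the URL above and need to be replaced with the real values):

  # Call the history server's SQL REST endpoint directly and show the response status
  curl -i "http://___history_server___/proxy/___application_id____/api/v1/applications/___application_id____/sql?offset=0&length=1000&planDescription=false"

If curl also returns HTTP 500, the failure is in the history server itself rather than in the DataFlint UI.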

@menishmueli
Contributor

The request that is failing with 500 is a GET request to an official Spark REST API endpoint.
[Screenshot of the endpoint's entry in the Spark REST API documentation]
(Source: https://spark.apache.org/docs/latest/monitoring.html)

I have gotten multiple reports so far that this request sometimes fails in Spark 3.2; it seems there is a bug in Spark that was fixed in later versions.

If possible, you could upgrade your history server to the latest Spark version (3.5), and I believe that would solve the problem. The history server is backward compatible, so it can still show runs that were submitted with Spark 3.2.
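
A minimal sketch of what such an upgrade could look like on a standalone deployment (the download URL, install directory, and event-log path below are assumptions and should be adapted to your environment):

  # Download and unpack a Spark 3.5 distribution next to the existing install (assumed paths)
  wget https://archive.apache.org/dist/spark/spark-3.5.0/spark-3.5.0-bin-hadoop3.tgz
  tar -xzf spark-3.5.0-bin-hadoop3.tgz

  # Point the new history server at the event logs already written by the Spark 3.2 jobs
  echo "spark.history.fs.logDirectory /path/to/existing/event-logs" >> spark-3.5.0-bin-hadoop3/conf/spark-defaults.conf

  # Start the 3.5 history server; it can read event logs produced by older Spark versions
  spark-3.5.0-bin-hadoop3/sbin/start-history-server.sh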

Could you also please look at your history server logs and share the exception that resulted in error 500?
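
To find that exception, a hedged example of where to look on a standalone setup (the log directory and file name pattern are the defaults and may differ in your deployment):

  # History server daemon logs default to $SPARK_HOME/logs on standalone
  # Search for the stack trace logged around the time of the failing /sql request
  grep -iA 30 "exception" $SPARK_HOME/logs/spark-*-org.apache.spark.deploy.history.HistoryServer-*.out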

@barunpuri
Author

barunpuri commented Feb 2, 2024

Thank you for your kind response.
For now, it's hard to test with 3.5 because a different team manages the history server.
Since this is resolved in later Spark versions, the issue can be closed. Thank you!
