SQLQuery default print error instead of throwing error stack #581

Open · wants to merge 6 commits into master

Conversation

jackylee-ch

By default, a Spark failure only prints the error message; no exception is thrown unless the user sets spark_statement_errors_are_fatal. Running SQL should behave the same way, since for most users the error message is sufficient.
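For reference, a minimal sketch of opting in to the fatal behavior programmatically, assuming the override hook that sparkmagic's configuration module exposes (most users would instead set the key in ~/.sparkmagic/config.json, so treat this as an illustration rather than part of the PR):

from sparkmagic.utils import configuration as conf

# Default: statement errors are printed, not raised.
assert conf.spark_statement_errors_are_fatal() is False

# Opt in to raising an exception on statement failures.
conf.override(conf.spark_statement_errors_are_fatal.__name__, True)
assert conf.spark_statement_errors_are_fatal() is True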

@jackylee-ch (Author)

cc @itamarst

@PedroRossi (Collaborator) commented on Oct 18, 2019

Hello @stczwd, I am studying the issue of throwing errors instead of printing them in the Spark kernel (because I really like/use this project :) ). Do you think it would be better if https://github.com/jupyter-incubator/sparkmagic/blob/master/sparkmagic/sparkmagic/livyclientlib/sqlquery.py#L48 returned something like https://github.com/jupyter-incubator/sparkmagic/blob/master/sparkmagic/sparkmagic/magics/sparkmagicsbase.py#L44, so that the whole idea of

if not success:
    if conf.spark_statement_errors_are_fatal():
        if conf.shutdown_session_on_spark_statement_errors():
            self.spark_controller.cleanup()
        raise SparkStatementException(out)

could be extracted into another function instead of having the same code duplicated in different files.

This whole logic could also be extracted into a function in a separate file and become easily reusable, but my main point is to ask where this error throwing should be centralized. I think it should be centralized in the sparkmagicsbase file, since it is the one that controls what appears on the screen, and now also whether errors are fatal instead of only being shown and whether the session should be shut down on a statement error. A sketch of such an extraction follows below.
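A minimal sketch of what that extraction might look like. The helper name and its placement are hypothetical; only SparkStatementException, the two conf flags, and spark_controller.cleanup() are taken from the snippet above, and the import locations are assumed to match the existing code:

from sparkmagic.utils import configuration as conf
from sparkmagic.livyclientlib.exceptions import SparkStatementException

def handle_fatal_statement_error(spark_controller, out):
    # Centralized copy of the duplicated block: optionally shut the
    # session down, then surface the failure as a real exception.
    if conf.spark_statement_errors_are_fatal():
        if conf.shutdown_session_on_spark_statement_errors():
            spark_controller.cleanup()
        raise SparkStatementException(out)

Call sites would then shrink to:

if not success:
    handle_fatal_statement_error(self.spark_controller, out)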

@jackylee-ch (Author)

@PedroRossi Yep, this is a good suggestion. I will check it out.

@jackylee-ch (Author)

@PedroRossi This is a great idea, but it needs a lot of changes. I will submit it in a new patch, where we can discuss it.

@itamarst (Contributor)

Could you describe the current status of this PR? What is the goal, and what does it try to do?

@itamarst added the awaiting-submitter-response label ("Waiting for answer from person who submitted question") on Nov 25, 2019
@jackylee-ch (Author)

@itamarst This patch mainly controls exception throwing in SQL queries. There will be another patch to handle the problem @PedroRossi discussed with me.

@devstein (Collaborator) left a comment


Please add tests for this

result = records_to_dataframe(records_text, session.kind, self._coerce)
if conf.spark_statement_errors_are_fatal():
    raise SparkStatementException(records_text)
self.ipython_display.send_error(records_text)
Collaborator

If conf.spark_statement_errors_are_fatal() == False and a SparkStatementException is raised, then wouldn't this be handled and displayed by handle_expected_exceptions?
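For context, handle_expected_exceptions is a decorator in sparkmagic's livyclientlib exceptions module that catches a known set of exception types and routes them to ipython_display.send_error instead of letting them propagate. A simplified paraphrase, not the exact source (the real decorator enumerates specific exception types rather than this stand-in tuple):

EXPECTED_EXCEPTIONS = (Exception,)  # stand-in; the real list names specific types

def handle_expected_exceptions(f):
    # Show "expected" failures to the user via ipython_display instead
    # of surfacing them as tracebacks.
    def wrapped(self, *args, **kwargs):
        try:
            return f(self, *args, **kwargs)
        except EXPECTED_EXCEPTIONS as err:
            self.ipython_display.send_error(str(err))
            return None
    return wrapped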
