Fix for enhancement#1003: Google Code-in Task to add a command for di… #256

Open
wants to merge 10 commits into master

Conversation

@Rithi-21 commented Jan 13, 2020

Fix for enhancement#1003: Google Code-in Task to add a command for displaying the text from stderr file generated during submission in terminal.

======

This is for a Google Code-in Task: add a command for displaying the text from the stderr file generated during submission in the terminal. I followed the same feedback provided by mentor Mr. Kartik Verma on the other task ("to display the stdout file from the remote host, i.e. where EvalAI has stored the file; the request is to make an API call to the EvalAI server to get submission details").

Following are the modifications (a rough sketch of the utils-side helper follows below):

modified/added:
modified: main.py to include the submission_error command
modified: evalai/utils/submissions.py to display the stderr message
added: submission_error.py file to support the above functionality.

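For reference, the utils-side helper described above could look roughly like the sketch below. The endpoint, field names, and helper name are assumptions for illustration, not the exact code in this PR; the real CLI also attaches the auth token to its requests.

```python
# Illustrative sketch only; endpoint, field names, and helper name are assumptions.
import requests
from click import echo, style

API_HOST = "https://eval.ai"  # assumed default host


def display_submission_stderr(submission_id):
    """Fetch submission details from the EvalAI server and print the stderr file."""
    details = requests.get(
        "{}/api/jobs/submission/{}".format(API_HOST, submission_id)
    ).json()
    stderr_url = details.get("stderr_file")
    if not stderr_url:
        # The stderr file only exists once the submission has been evaluated.
        echo(style("\nThe Submission is yet to be evaluated.\n", bold=True, fg="yellow"))
        return
    echo(requests.get(stderr_url).text)
```
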
@Rithi-21 (Author) commented Jan 13, 2020

To test:

To display the submission error message, issue the CLI command below:

evalai submission_error 48728

To list all the commands, issue the CLI command below:

evalai --help

evalai/main.py Outdated
@@ -40,6 +41,7 @@ def main(ctx):
main.add_command(push)
main.add_command(set_token)
main.add_command(submission)
main.add_command(submission_error)
@krtkvrm (Member) commented Jan 13, 2020

How about you nest this under the submission command:

$ evalai submission <ID> stderr

@Rithi-21 (Author)

As suggested, I updated the changes. Please review and let me know on the Google Code-in task page.

To test, use:

evalai submission 48728 stderr

Added the stderr function nested under the submission command as suggested by Mr. Kartik Verma. Also reverted the changes to main.py and removed the new file submission_error.py.
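
For reference, nesting the new command under the existing click group (the approach the PR ends up with) could look roughly like the sketch below; the layout just mirrors how click subcommands work in general, and the helper is the hypothetical one sketched in the description above, not the merged code.

```python
# Illustrative sketch of nesting a stderr subcommand under the submission group.
import click
from click import echo


def display_submission_stderr(submission_id):
    # Hypothetical helper (see the sketch in the PR description above) that would
    # fetch the submission details and print the contents of the stderr file.
    echo("stderr for submission {}".format(submission_id))


@click.group(invoke_without_command=True)
@click.argument("SUBMISSION_ID", type=int)
@click.pass_context
def submission(ctx, submission_id):
    """View a submission, e.g. `evalai submission 48728`."""
    # Store the ID so nested subcommands such as `stderr` can reuse it.
    ctx.obj = submission_id


@submission.command()
@click.pass_obj
def stderr(submission_id):
    """Print the stderr file of a submission: `evalai submission <ID> stderr`."""
    display_submission_stderr(submission_id)
```

With this layout, `evalai submission 48728 stderr` parses 48728 as the group's SUBMISSION_ID argument and then dispatches to the stderr subcommand.
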
Comment on lines 297 to 304
except requests.exceptions.MissingSchema:
echo(
style(
"\nThe Submission is yet to be evaluated.\n",
bold=True,
fg="yellow",
)
)
Member

Can you explain why you have used requests.exceptions.MissingSchema?
Going by the docs, MissingSchema is raised when the URL schema (e.g. http or https) is missing.

@Rithi-21 (Author)

@vkartik97, I used this exception because, as per your earlier note on my previous task ("display stdout message in terminal"), the stdout file is generated only after the submission evaluation is completed. So if the file is not available, the submission is supposed to be "yet to be evaluated", and I thought that would be the right exception. I followed the same pattern as the display_submission_result function in submissions.py for this one as well. If you don't want this exception to be shown, let me know which exception you are suggesting. Thank you.

Member

You can use another exception, or use the status data from the payload.
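
A hedged sketch of the second option (deciding from the status field in the submission payload instead of catching requests.exceptions.MissingSchema) might look like the following; the field names and status values are assumptions, not the exact payload shape.

```python
# Sketch: decide from the payload status instead of catching MissingSchema.
import requests
from click import echo, style


def display_submission_stderr(details):
    """`details` is the JSON payload returned by the submission-details API call."""
    # Assumed status values; the stderr file is only produced once evaluation ends.
    if details.get("status") not in ("finished", "failed"):
        echo(style("\nThe Submission is yet to be evaluated.\n", bold=True, fg="yellow"))
        return
    echo(requests.get(details["stderr_file"]).text)
```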

Siva Neelakantan and others added 3 commits January 18, 2020 19:22
Fixed failing build on PR#256. Google Code-in task #1003
Fixed failed PR#256 for Google Code-in task #1003
@pushkalkatara

Hi @Rithi-21, can you please also fix the coverage as part of the build by writing tests?

PR#256 - Adding test case for the stderr file output to the terminal window. This is for Google Code-in task #1003
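
A coverage test for the nested command might look roughly like the sketch below, using click's CliRunner and the responses library to stub the HTTP calls. The module path, URLs, and payloads are assumptions for illustration, not the tests actually added in this PR.

```python
# Illustrative test sketch; module path, URLs, and payloads are assumptions.
import responses
from click.testing import CliRunner

from evalai.submissions import submission  # assumed location of the submission group


@responses.activate
def test_display_submission_stderr():
    # Stub the submission-details call and the stderr file download.
    responses.add(
        responses.GET,
        "https://eval.ai/api/jobs/submission/48728",
        json={"stderr_file": "https://s3.example.com/stderr.txt"},
        status=200,
    )
    responses.add(
        responses.GET,
        "https://s3.example.com/stderr.txt",
        body="Traceback (most recent call last): ...",
        status=200,
    )

    runner = CliRunner()
    result = runner.invoke(submission, ["48728", "stderr"])
    assert result.exit_code == 0
    assert "Traceback" in result.output
```
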
@pushkalkatara

Hi @Rithi-21, the coverage is still not fixed; it falls short by 0.9%. Please resolve the issue.

Fixed code coverage for the stderr file and updated the exception to a common error.
Updated version to fix code coverage and improve it by 0.9%. Removed the exception since it retrieves a straightforward URL from the S3 bucket, and moreover we are printing the error message to the terminal.
@pushkalkatara

Hi @Rithi-21, the PR now looks fixed. Please reclaim the GCI issue so that we can approve it, as we can see you have abandoned it recently.

@Rithi-21 (Author)

> Hi @Rithi-21, the PR now looks fixed. Please reclaim the GCI issue so that we can approve it, as we can see you have abandoned it recently.

Hello @pushkalkatara, I cannot claim the task anymore since the task claiming deadline is over. Anyway, I am happy to know that it is good to go. I worked hard on this and learned many new things from this task, but I will be missing the credit. Thank you for all your support.
