
Fix part of #12912: Enabling linter consider-using-with #20134

Closed
wants to merge 11 commits

Conversation


@m0nax5 m0nax5 commented Apr 6, 2024

Overview

  1. This PR fixes part of #12912 (Fix pylint problems).
  2. This PR does the following: enables the linter check consider-using-with again by fixing every instance of this check failing in the existing codebase, introducing with statements where they are needed.

Essential Checklist

Please follow the instructions for making a code change.

  • I have linked the issue that this PR fixes in the "Development" section of the sidebar.
  • I have checked the "Files Changed" tab and confirmed that the changes are what I want to make.
  • I have written tests for my code.
  • The PR title starts with "Fix #bugnum: " or "Fix part of #bugnum: ...", followed by a short, clear summary of the changes.
  • I have assigned the correct reviewers to this PR (or will leave a comment with the phrase "@{{reviewer_username}} PTAL" if I can't assign them directly).

Proof that changes are correct

After removing 'consider-using-with' from the disabled linters, the linter output was:

[screenshot: linter_checks_fail]

Now, after the changes from this PR, all linter checks pass:

[screenshot: linter_checks_pass]

PR Pointers

@m0nax5 m0nax5 requested review from a team as code owners April 6, 2024 16:13

oppiabot bot commented Apr 6, 2024

Assigning @lkbhitesh07 for the first pass review of this PR. Thanks!

@lkbhitesh07
Member

Hey @m0nax5 thanks for the PR, could you please fix the failing tests and then assign me again?
Thanks

@lkbhitesh07 lkbhitesh07 assigned m0nax5 and unassigned lkbhitesh07 Apr 10, 2024
@m0nax5
Author

m0nax5 commented Apr 10, 2024

Hi! Yes, I'm currently working on it.

@m0nax5 m0nax5 requested review from a team as code owners April 13, 2024 22:19
@m0nax5
Author

m0nax5 commented Apr 16, 2024

Hi @lkbhitesh07, I fixed the linter issues about consider-using-with, but it seems that I have introduced another issue along the way. I believe it was introduced in this commit, where only the extend_index_yaml_test.py file was modified, since that is the first time this issue appeared in the lint tests, which I ran after each code modification.
I ran PYTHONIOENCODING=utf-8 python -m scripts.linters.run_lint_checks --shard other --verbose and obtained the following error:

Unable to run the complete lint test, please check the following stack trace and fix the errors:
+--------------------------+
Traceback (most recent call last):
  File "opensource2/oppia/scripts/linters/general_purpose_linter.py", line 434, in _check_for_mandatory_pattern_in_file
    file_content = self.file_cache.readlines(filepath)
  File "opensource2/oppia/scripts/linters/run_lint_checks.py", line 165, in readlines
    _, line_by_line_content = self._get_data(filepath, mode)
  File "opensource2/oppia/scripts/linters/run_lint_checks.py", line 188, in _get_data
    lines = f.readlines()
  File ".pyenv/versions/3.8.15/lib/python3.8/codecs.py", line 322, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xd9 in position 99: invalid continuation byte

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "opensource2/oppia/scripts/concurrent_task_utils.py", line 115, in run
    self.task_results = self.func()
  File "opensource2/oppia/scripts/linters/general_purpose_linter.py", line 638, in perform_all_lint_checks
    self.check_mandatory_patterns(), self.check_bad_patterns(),
  File "opensource2/oppia/scripts/linters/general_purpose_linter.py", line 474, in check_mandatory_patterns
    self._check_for_mandatory_pattern_in_file(
  File "opensource2/oppia/scripts/linters/general_purpose_linter.py", line 436, in _check_for_mandatory_pattern_in_file
    raise Exception('%s %s' % (filepath, e)) from e
Exception: .coverage.DESKTOP-G4UNK6D.9743.881541 'utf-8' codec can't decode byte 0xd9 in position 99: invalid continuation byte

--------------------------------------------------

Traceback (most recent call last):
  File "opensource2/oppia/scripts/linters/general_purpose_linter.py", line 434, in _check_for_mandatory_pattern_in_file
    file_content = self.file_cache.readlines(filepath)
  File "opensource2/oppia/scripts/linters/run_lint_checks.py", line 165, in readlines
    _, line_by_line_content = self._get_data(filepath, mode)
  File "opensource2/oppia/scripts/linters/run_lint_checks.py", line 188, in _get_data
    lines = f.readlines()
  File ".pyenv/versions/3.8.15/lib/python3.8/codecs.py", line 322, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xd9 in position 99: invalid continuation byte

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "opensource2/oppia/scripts/concurrent_task_utils.py", line 115, in run
    self.task_results = self.func()
  File "opensource2/oppia/scripts/linters/general_purpose_linter.py", line 638, in perform_all_lint_checks
    self.check_mandatory_patterns(), self.check_bad_patterns(),
  File "opensource2/oppia/scripts/linters/general_purpose_linter.py", line 474, in check_mandatory_patterns
    self._check_for_mandatory_pattern_in_file(
  File "opensource2/oppia/scripts/linters/general_purpose_linter.py", line 436, in _check_for_mandatory_pattern_in_file
    raise Exception('%s %s' % (filepath, e)) from e
Exception: .coverage.DESKTOP-G4UNK6D.9743.881541 'utf-8' codec can't decode byte 0xd9 in position 99: invalid continuation byte

--------------------------------------------------

--------------------------------------------------
Some of the linting functions may not run until the above errors gets fixed
---------------------------
Linter Checks Failed.
---------------------------

I looked into it, and from what I understand there is an error in finding file paths, which I believe is due to an issue in decoding UTF-8. A similar error also occurred in the backend tests that ran automatically after my last push: each time there is an unexpected folder name in the middle of the file path, such as in /home/runner/work/oppia/oppia/core/_8k8huiz/backend_file.py or /home/runner/work/oppia/oppia/core/w1hjl_1q/backend_file.py (see traceback below).

======================================================================
ERROR: test_checks_fail_when_a_backend_file_lacks_associated_test_file (scripts.check_backend_associated_test_file_test.CheckBackendAssociatedTestFileTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/runner/work/oppia/oppia/scripts/check_backend_associated_test_file_test.py", line 53, in test_checks_fail_when_a_backend_file_lacks_associated_test_file
    with open(backend_file, 'w', encoding='utf8') as f:
FileNotFoundError: [Errno 2] No such file or directory: '/home/runner/work/oppia/oppia/core/_8k8huiz/backend_file.py'

======================================================================
ERROR: test_pass_when_file_in_exclusion_list_lacks_associated_test (scripts.check_backend_associated_test_file_test.CheckBackendAssociatedTestFileTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/runner/work/oppia/oppia/scripts/check_backend_associated_test_file_test.py", line 75, in test_pass_when_file_in_exclusion_list_lacks_associated_test
    with open(backend_file, 'w', encoding='utf8') as f:
FileNotFoundError: [Errno 2] No such file or directory: '/home/runner/work/oppia/oppia/core/w1hjl_1q/backend_file.py'

I don't really know how to handle this issue, other than that the problem comes from UTF-8 decoding and probably stems from my changes to the file scripts/extend_index_yaml_test.py. Could you please give me some guidance regarding the issue?
Thanks a lot!

Contributor

@StephenYu2018 StephenYu2018 left a comment


LGTM.


oppiabot bot commented Apr 16, 2024

Assigning @U8NWXD, @lkbhitesh07 for code owner reviews. Thanks!

@U8NWXD
Member

U8NWXD commented Apr 17, 2024

Since this PR is going to change when you fix the CI checks, please re-assign me for review once those are passing

@U8NWXD U8NWXD removed their assignment Apr 17, 2024
@m0nax5
Author

m0nax5 commented Apr 19, 2024

Hi @U8NWXD, I have tried several different syntaxes to solve my UTF-8 encoding issue in extend_index_yaml_test.py, but I haven't been able to find a solution that passes the CI tests.
For instance, I have tried the following (diff compared to my last pushed code):

diff --git a/scripts/extend_index_yaml_test.py b/scripts/extend_index_yaml_test.py
index 4264837a3d..2523cfd479 100644
--- a/scripts/extend_index_yaml_test.py
+++ b/scripts/extend_index_yaml_test.py
@@ -19,6 +19,7 @@
 from __future__ import annotations

 import tempfile
+import unittest

 from core.tests import test_utils

@@ -168,38 +169,51 @@ class ReformatXmlToYamlTests(test_utils.GenericTestBase):
         )


-class ExtendIndexYamlTests(test_utils.GenericTestBase):
+class ExtendIndexYamlTests(unittest.TestCase):
     """Class for testing the extend_index_yaml script."""

     def _run_test_for_extend_index_yaml(
         self, index_yaml: str, web_inf_index_xml: str, expected_index_yaml: str
     ) -> None:
         """Run tests for extend_index_yaml script."""
-        with tempfile.NamedTemporaryFile() as index_yaml_file:
-            with tempfile.NamedTemporaryFile() as web_inf_index_xml_file:
-                with self.swap(
-                    extend_index_yaml, 'INDEX_YAML_PATH',
-                    index_yaml_file.name
-                ):
-                    with self.swap(
-                        extend_index_yaml, 'WEB_INF_INDEX_XML_PATH',
-                        web_inf_index_xml_file.name
-                    ):
-                        with open(
-                            index_yaml_file.name, 'w', encoding='utf-8'
-                        ) as open_index_yaml_w:
-                            open_index_yaml_w.write(index_yaml)
-                        with open(
-                            web_inf_index_xml_file.name, 'a', encoding='utf-8'
-                        ) as open_web_inf_index_xml:
-                            open_web_inf_index_xml.write(web_inf_index_xml)
-                        extend_index_yaml.main()
-                        with open(
-                            index_yaml_file.name, 'r', encoding='utf-8'
-                        ) as open_index_yaml_r:
-                            actual_index_yaml = open_index_yaml_r.read()
-                        self.assertEqual(
-                            actual_index_yaml, expected_index_yaml)
+        original_yaml_pth = extend_index_yaml.INDEX_YAML_PATH
+        original_xml_pth = extend_index_yaml.WEB_INF_INDEX_XML_PATH
+
+        with tempfile.NamedTemporaryFile(
+            mode='w+', encoding='utf-8'
+        ) as tmp_index_yaml:
+            extend_index_yaml.INDEX_YAML_PATH = tmp_index_yaml.name
+            with tempfile.NamedTemporaryFile(
+                mode='w+', encoding='utf-8'
+            ) as tmp_index_xml:
+                extend_index_yaml.WEB_INF_INDEX_XML_PATH = tmp_index_xml.name
+
+                try:
+                    with open(
+                        extend_index_yaml.INDEX_YAML_PATH, 'w',
+                        encoding='utf-8'
+                    ) as f:
+                        f.write(index_yaml)
+
+                    with open(
+                        extend_index_yaml.WEB_INF_INDEX_XML_PATH, 'a',
+                        encoding='utf-8'
+                    ) as f:
+                        f.write(web_inf_index_xml)
+
+                    extend_index_yaml.main()
+
+                    with open(
+                        extend_index_yaml.INDEX_YAML_PATH, 'r',
+                        encoding='utf-8'
+                    ) as f:
+                        actual_index_yaml = f.read()
+
+                    self.assertEqual(actual_index_yaml, expected_index_yaml)
+
+                finally:
+                    extend_index_yaml.INDEX_YAML_PATH = original_yaml_pth
+                    extend_index_yaml.WEB_INF_INDEX_XML_PATH = original_xml_pth

     def test_extend_index_yaml_with_changes(self) -> None:
         index_yaml = """indexes:

The same kind of failures as in my previous comment appeared in each of my attempts to run the linter checks, i.e. it was unable to find file paths due to unreadable UTF-8 characters. Yet apart from that, the lint tests pass.

I would really appreciate it if you could provide some guidance on how to tackle this, especially if you have any idea why some UTF-8 characters are not decoded correctly (since I didn't change anything in the parts related to UTF-8 encoding). Thanks for your help!

@U8NWXD
Member

U8NWXD commented Apr 19, 2024

From Exception: .coverage.DESKTOP-G4UNK6D.9743.881541 'utf-8' codec can't decode byte 0xd9 in position 99: invalid continuation byte, it looks like the linter is trying to read a .coverage* file, which is generated by the backend coverage checks (specifically coverage.py). These coverage files are binary, so it makes sense that trying to decode them using utf-8 will fail.

I recommend investigating why the linter is trying to read this file even though we should be excluding it from the lint checks, since it's not a code file.
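The decode failure can be reproduced in isolation (the bytes below are illustrative, not taken from a real coverage file): 0xd9 starts a two-byte UTF-8 sequence, so when the next byte is not a continuation byte (in the 0x80-0xBF range), the codec raises exactly this error.

```python
# Binary files such as .coverage* (SQLite databases) contain arbitrary
# bytes. 0xd9 is a two-byte UTF-8 lead byte, so a following byte outside
# the 0x80-0xBF continuation range makes the decoder fail.
binary_sample = b'SQLite format 3\x00' + b'\xd9A'  # illustrative bytes only
try:
    binary_sample.decode('utf-8')
    decode_failed = False
except UnicodeDecodeError as exc:
    decode_failed = True
    reason = exc.reason  # 'invalid continuation byte'
```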

@U8NWXD U8NWXD removed their assignment Apr 19, 2024
@m0nax5
Author

m0nax5 commented Apr 20, 2024

Thanks for the tip! You were right: I was able to pass all linter checks once I had manually removed all the .coverage* files that had been created when I ran PYTHONIOENCODING=utf-8 python -m scripts.run_backend_tests --generate_coverage_report --ignore_coverage --exclude_load_tests --test_shard 5, since it had the --generate_coverage_report flag. I still need to look into why these files are not ignored by the linter tests. I'm also now working on the UTF-8 issues raised by these backend tests.
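One way such generated artifacts can be kept out of a linter's file list (a hypothetical sketch, not Oppia's actual exclusion code; the pattern list and function name are invented for illustration) is to filter candidate paths against glob patterns before reading them:

```python
import fnmatch
import os

# Hypothetical exclusion patterns; a real linter would keep its own list.
EXCLUDED_PATTERNS = ['.coverage*', '*.pyc']

def filter_lintable(filepaths):
    """Drop any path whose basename matches an excluded pattern."""
    return [
        path for path in filepaths
        if not any(
            fnmatch.fnmatch(os.path.basename(path), pattern)
            for pattern in EXCLUDED_PATTERNS
        )
    ]
```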

Member

@lkbhitesh07 lkbhitesh07 left a comment


Hey @m0nax5, LGTM for the code owner files. Thanks for the changes; I have left a comment regarding the failing backend tests that I hope will help!
Thanks!

frontend_file = os.path.join(tempdir.name, 'frontend_file.ts')
with tempfile.TemporaryDirectory(
prefix=os.getcwd() + '/core/') as tempdir:
backend_file = os.path.join(tempdir, 'backend_file.py')
Member


Hey @m0nax5, for this backend test, I think it should be tempdir.name here. Could you please check whether the backend tests pass after that change? The same change applies to the other backend test below as well.
Thanks

Author


Hi @lkbhitesh07, I implemented the changes you suggested across the file, and the backend tests still won't pass. Here is the log I got for the backend tests:

======================================================================
ERROR: test_checks_fail_when_a_backend_file_lacks_associated_test_file (scripts.check_backend_associated_test_file_test.CheckBackendAssociatedTestFileTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/opensource2/oppia/scripts/check_backend_associated_test_file_test.py", line 54, in test_checks_fail_when_a_backend_file_lacks_associated_test_file
    with open(backend_file, 'w', encoding='utf8') as f:
FileNotFoundError: [Errno 2] No such file or directory: '/home/opensource2/oppia/core/z0twcbrp/backend_file.py'

======================================================================
ERROR: test_checks_pass_when_all_backend_files_have_an_associated_test_file (scripts.check_backend_associated_test_file_test.CheckBackendAssociatedTestFileTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/opensource2/oppia/scripts/check_backend_associated_test_file_test.py", line 97, in test_checks_pass_when_all_backend_files_have_an_associated_test_file
    backend_file = os.path.join(tempdir.name, 'backend_file.py')
AttributeError: 'str' object has no attribute 'name'

======================================================================
ERROR: test_pass_when_file_in_exclusion_list_lacks_associated_test (scripts.check_backend_associated_test_file_test.CheckBackendAssociatedTestFileTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/opensource2/oppia/scripts/check_backend_associated_test_file_test.py", line 76, in test_pass_when_file_in_exclusion_list_lacks_associated_test
    backend_file = os.path.join(tempdir.name, 'backend_file.py')
AttributeError: 'str' object has no attribute 'name'

---------------------------------------------------------------------

The first error is an occurrence of the previous syntax without .name that I left in on purpose, whereas the last two errors are what I get when I add .name. What I don't understand is why tempdir has type str when it comes from a with clause, whereas when it is declared with tempdir = tempfile.TemporaryDirectory() it has type tempfile.TemporaryDirectory, so tempdir.name makes sense and has type str. I'm currently looking into a solution for this issue.
Thanks again for the review!
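This type difference is standard library behavior: tempfile.TemporaryDirectory.__enter__ returns self.name (a str), whereas instantiating the class directly yields the object itself, which exposes the path via its .name attribute. A short sketch:

```python
import os
import tempfile

# As a context manager, __enter__ hands back the path string itself,
# so tempdir.name would fail with AttributeError.
with tempfile.TemporaryDirectory() as tempdir:
    assert isinstance(tempdir, str)
    path_in_with = os.path.join(tempdir, 'backend_file.py')

# Instantiated directly, the object (not a str) is returned, and the
# path lives in its .name attribute; cleanup must then be done manually.
tempdir_obj = tempfile.TemporaryDirectory()
assert isinstance(tempdir_obj, tempfile.TemporaryDirectory)
path_direct = os.path.join(tempdir_obj.name, 'backend_file.py')
tempdir_obj.cleanup()
```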

@lkbhitesh07 lkbhitesh07 removed their assignment Apr 27, 2024

oppiabot bot commented Apr 27, 2024

Assigning @U8NWXD for code owner reviews. Thanks!

@U8NWXD
Member

U8NWXD commented Apr 27, 2024

De-assigning myself for now since this PR still needs CI fixes

@U8NWXD U8NWXD removed their assignment Apr 27, 2024

oppiabot bot commented May 4, 2024

Hi @m0nax5, I'm going to mark this PR as stale because it hasn't had any updates for 7 days. If no further activity occurs within 4 days, it will be automatically closed so that others can take up the issue.
If you are still working on this PR, please make a follow-up commit within 4 days (and submit it for review, if applicable). Please also let us know if you are stuck so we can help you!

@oppiabot oppiabot bot added stale and removed stale labels May 4, 2024

oppiabot bot commented May 12, 2024

Hi @m0nax5, I'm going to mark this PR as stale because it hasn't had any updates for 7 days. If no further activity occurs within 4 days, it will be automatically closed so that others can take up the issue.
If you are still working on this PR, please make a follow-up commit within 4 days (and submit it for review, if applicable). Please also let us know if you are stuck so we can help you!

@oppiabot oppiabot bot added stale and removed stale labels May 12, 2024

oppiabot bot commented May 19, 2024

Hi @m0nax5, I'm going to mark this PR as stale because it hasn't had any updates for 7 days. If no further activity occurs within 4 days, it will be automatically closed so that others can take up the issue.
If you are still working on this PR, please make a follow-up commit within 4 days (and submit it for review, if applicable). Please also let us know if you are stuck so we can help you!

@oppiabot oppiabot bot added the stale label May 19, 2024
@seanlip seanlip mentioned this pull request May 22, 2024
9 tasks
@oppiabot oppiabot bot closed this May 23, 2024