
BUG : Action Fails #425

Open
ddok2 opened this issue Mar 13, 2023 · 42 comments · May be fixed by #449 or #464
Labels
bug Something isn't working

Comments

@ddok2

ddok2 commented Mar 13, 2023

Describe the bug
image
image

The action has been failing for about a month...

image

Traceback (most recent call last):
  File "/waka-readme-stats/main.py", line 221, in <module>
    run(main())
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 208, in main
    stats = await get_stats()
            ^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 156, in get_stats
    yearly_data, commit_data = await calculate_commit_data(repositories)
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/yearly_commit_calculator.py", line [37](https://github.com/ddok2/ddok2/actions/runs/4403414537/jobs/7711737364#step:3:38), in calculate_commit_data
    await update_data_with_commit_stats(repo, yearly_data, date_data)
  File "/waka-readme-stats/yearly_commit_calculator.py", line 65, in update_data_with_commit_stats
    date = search(r"\d+-\d+-\d+", commit["committedDate"]).group()
                                  ~~~~~~^^^^^^^^^^^^^^^^^
TypeError: 'NoneType' object is not subscriptable
sys:1: RuntimeWarning: coroutine 'AsyncClient.get' was never awaited

GitHub repository link

@ddok2 ddok2 added the bug Something isn't working label Mar 13, 2023
@eby8zevin

Same problem

  File "/waka-readme-stats/main.py", line 222, in <module>
    run(main())
  File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
    return future.result()
  File "/waka-readme-stats/main.py", line 208, in main
    stats = await get_stats()
  File "/waka-readme-stats/main.py", line 156, in get_stats
    yearly_data, commit_data = await calculate_commit_data(repositories)
  File "/waka-readme-stats/yearly_commit_calculator.py", line [37](https://github.com/eby8zevin/eby8zevin/actions/runs/4398547065/jobs/7702477607#step:3:38), in calculate_commit_data
    await update_data_with_commit_stats(repo, yearly_data, date_data)
  File "/waka-readme-stats/yearly_commit_calculator.py", line 65, in update_data_with_commit_stats
    date = search(r"\d+-\d+-\d+", commit["committedDate"]).group()
TypeError: 'NoneType' object is not subscriptable
sys:1: RuntimeWarning: coroutine 'AsyncClient.get' was never awaited

@Fanduzi

Fanduzi commented Mar 14, 2023

same issue

@moncheeta

I also have the same problem. For me, it has been occurring for the past few days.

@yanskun

yanskun commented Mar 19, 2023

It has been successful for a few days now.
Thanks for the fix.

@mikebronner

mikebronner commented Mar 19, 2023

Mine has not worked once since 11 Mar 2023.

Run anmol098/waka-readme-stats@master
/usr/bin/docker run --name wakareadmestatswakareadmestatsmaster_f5eb8b --label 6c0442 --workdir /github/workspace --rm -e "INPUT_WAKATIME_API_KEY" -e "INPUT_GH_TOKEN" -e "INPUT_SHOW_PROJECTS" -e "INPUT_SECTION_NAME" -e "INPUT_PULL_BRANCH_NAME" -e "INPUT_PUSH_BRANCH_NAME" -e "INPUT_SHOW_OS" -e "INPUT_SHOW_EDITORS" -e "INPUT_SHOW_TIMEZONE" -e "INPUT_SHOW_COMMIT" -e "INPUT_SHOW_LANGUAGE" -e "INPUT_SHOW_LINES_OF_CODE" -e "INPUT_SHOW_LANGUAGE_PER_REPO" -e "INPUT_SHOW_LOC_CHART" -e "INPUT_SHOW_DAYS_OF_WEEK" -e "INPUT_SHOW_PROFILE_VIEWS" -e "INPUT_SHOW_SHORT_INFO" -e "INPUT_SHOW_UPDATED_DATE" -e "INPUT_SHOW_TOTAL_CODE_TIME" -e "INPUT_COMMIT_BY_ME" -e "INPUT_COMMIT_MESSAGE" -e "INPUT_COMMIT_USERNAME" -e "INPUT_COMMIT_EMAIL" -e "INPUT_COMMIT_SINGLE" -e "INPUT_LOCALE" -e "INPUT_UPDATED_DATE_FORMAT" -e "INPUT_IGNORED_REPOS" -e "INPUT_SYMBOL_VERSION" -e "INPUT_DEBUG_LOGGING" -e "HOME" -e "GITHUB_JOB" -e "GITHUB_REF" -e "GITHUB_SHA" -e "GITHUB_REPOSITORY" -e "GITHUB_REPOSITORY_OWNER" -e "GITHUB_REPOSITORY_OWNER_ID" -e "GITHUB_RUN_ID" -e "GITHUB_RUN_NUMBER" -e "GITHUB_RETENTION_DAYS" -e "GITHUB_RUN_ATTEMPT" -e "GITHUB_REPOSITORY_ID" -e "GITHUB_ACTOR_ID" -e "GITHUB_ACTOR" -e "GITHUB_TRIGGERING_ACTOR" -e "GITHUB_WORKFLOW" -e "GITHUB_HEAD_REF" -e "GITHUB_BASE_REF" -e "GITHUB_EVENT_NAME" -e "GITHUB_SERVER_URL" -e "GITHUB_API_URL" -e "GITHUB_GRAPHQL_URL" -e "GITHUB_REF_NAME" -e "GITHUB_REF_PROTECTED" -e "GITHUB_REF_TYPE" -e "GITHUB_WORKFLOW_REF" -e "GITHUB_WORKFLOW_SHA" -e "GITHUB_WORKSPACE" -e "GITHUB_ACTION" -e "GITHUB_EVENT_PATH" -e "GITHUB_ACTION_REPOSITORY" -e "GITHUB_ACTION_REF" -e "GITHUB_PATH" -e "GITHUB_ENV" -e "GITHUB_STEP_SUMMARY" -e "GITHUB_STATE" -e "GITHUB_OUTPUT" -e "RUNNER_OS" -e "RUNNER_ARCH" -e "RUNNER_NAME" -e "RUNNER_TOOL_CACHE" -e "RUNNER_TEMP" -e "RUNNER_WORKSPACE" -e "ACTIONS_RUNTIME_URL" -e "ACTIONS_RUNTIME_TOKEN" -e "ACTIONS_CACHE_URL" -e GITHUB_ACTIONS=true -e CI=true -v "/var/run/docker.sock":"/var/run/docker.sock" -v "/home/runner/work/_temp/_github_home":"/github/home" -v "/home/runner/work/_temp/_github_workflow":"/github/workflow" -v "/home/runner/work/_temp/_runner_file_commands":"/github/file_commands" -v "/home/runner/work/mikebronner/mikebronner":"/github/workspace" wakareadmestats/waka-readme-stats:master
Traceback (most recent call last):
  File "/waka-readme-stats/main.py", line 221, in <module>
    run(main())
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 208, in main
    stats = await get_stats()
            ^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 156, in get_stats
    yearly_data, commit_data = await calculate_commit_data(repositories)
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/yearly_commit_calculator.py", line 37, in calculate_commit_data
    await update_data_with_commit_stats(repo, yearly_data, date_data)
  File "/waka-readme-stats/yearly_commit_calculator.py", line 63, in update_data_with_commit_stats
    commit_data = await DM.get_remote_graphql("repo_commit_list", owner=owner, name=repo_details["name"], branch=branch["name"], id=GHM.USER.node_id)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 293, in get_remote_graphql
    res = await DownloadManager._fetch_graphql_paginated(query, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 267, in _fetch_graphql_paginated
    initial_query_response = await DownloadManager._fetch_graphql_query(query, **kwargs, pagination="first: 100")
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 231, in _fetch_graphql_query
    raise Exception(f"Query '{query}' failed to run by returning code of {res.status_code}: {res.json()}")
Exception: Query 'repo_commit_list' failed to run by returning code of 403: {'documentation_url': 'https://docs.github.com/en/free-pro-team@latest/rest/overview/resources-in-the-rest-api#secondary-rate-limits', 'message': 'You have exceeded a secondary rate limit. Please wait a few minutes before you try again.'}
sys:1: RuntimeWarning: coroutine 'AsyncClient.get' was never awaited

@UnixBear

I've been getting this error for about a month now:

TypeError: 'NoneType' object is not subscriptable
sys:1: RuntimeWarning: coroutine 'AsyncClient.get' was never awaited

Same as the users above.

@aberdayy

aberdayy commented Mar 21, 2023

I'm getting this error like all the users above.
/usr/bin/docker run --name wakareadmestatswakareadmestatsmaster_c7aeaf --label 6c0442 --workdir /github/workspace --rm -e "INPUT_WAKATIME_API_KEY" -e "INPUT_GH_TOKEN" -e "INPUT_SHOW_COMMIT" -e "INPUT_SHOW_LINES_OF_CODE" -e "INPUT_SHOW_LANGUAGE" -e "INPUT_SHOW_TIMEZONE" -e "INPUT_SHOW_LOC_CHART" -e "INPUT_SHOW_PROFILE_VIEWS" -e "INPUT_SHOW_SHORT_INFO" -e "INPUT_SHOW_OS" -e "INPUT_SHOW_LANGUAGE_PER_REPO" -e "INPUT_SYMBOL_VERSION" -e "INPUT_SECTION_NAME" -e "INPUT_PULL_BRANCH_NAME" -e "INPUT_PUSH_BRANCH_NAME" -e "INPUT_SHOW_PROJECTS" -e "INPUT_SHOW_EDITORS" -e "INPUT_SHOW_DAYS_OF_WEEK" -e "INPUT_SHOW_UPDATED_DATE" -e "INPUT_SHOW_TOTAL_CODE_TIME" -e "INPUT_COMMIT_BY_ME" -e "INPUT_COMMIT_MESSAGE" -e "INPUT_COMMIT_USERNAME" -e "INPUT_COMMIT_EMAIL" -e "INPUT_COMMIT_SINGLE" -e "INPUT_LOCALE" -e "INPUT_UPDATED_DATE_FORMAT" -e "INPUT_IGNORED_REPOS" -e "INPUT_DEBUG_LOGGING" -e "HOME" -e "GITHUB_JOB" -e "GITHUB_REF" -e "GITHUB_SHA" -e "GITHUB_REPOSITORY" -e "GITHUB_REPOSITORY_OWNER" -e "GITHUB_REPOSITORY_OWNER_ID" -e "GITHUB_RUN_ID" -e "GITHUB_RUN_NUMBER" -e "GITHUB_RETENTION_DAYS" -e "GITHUB_RUN_ATTEMPT" -e "GITHUB_REPOSITORY_ID" -e "GITHUB_ACTOR_ID" -e "GITHUB_ACTOR" -e "GITHUB_TRIGGERING_ACTOR" -e "GITHUB_WORKFLOW" -e "GITHUB_HEAD_REF" -e "GITHUB_BASE_REF" -e "GITHUB_EVENT_NAME" -e "GITHUB_SERVER_URL" -e "GITHUB_API_URL" -e "GITHUB_GRAPHQL_URL" -e "GITHUB_REF_NAME" -e "GITHUB_REF_PROTECTED" -e "GITHUB_REF_TYPE" -e "GITHUB_WORKFLOW_REF" -e "GITHUB_WORKFLOW_SHA" -e "GITHUB_WORKSPACE" -e "GITHUB_ACTION" -e "GITHUB_EVENT_PATH" -e "GITHUB_ACTION_REPOSITORY" -e "GITHUB_ACTION_REF" -e "GITHUB_PATH" -e "GITHUB_ENV" -e "GITHUB_STEP_SUMMARY" -e "GITHUB_STATE" -e "GITHUB_OUTPUT" -e "RUNNER_OS" -e "RUNNER_ARCH" -e "RUNNER_NAME" -e "RUNNER_TOOL_CACHE" -e "RUNNER_TEMP" -e "RUNNER_WORKSPACE" -e "ACTIONS_RUNTIME_URL" -e "ACTIONS_RUNTIME_TOKEN" -e "ACTIONS_CACHE_URL" -e GITHUB_ACTIONS=true -e CI=true -v "/var/run/docker.sock":"/var/run/docker.sock" -v "/home/runner/work/_temp/_github_home":"/github/home" -v "/home/runner/work/_temp/_github_workflow":"/github/workflow" -v "/home/runner/work/_temp/_runner_file_commands":"/github/file_commands" -v "/home/runner/work/aberdayy/aberdayy":"/github/workspace" wakareadmestats/waka-readme-stats:master

Traceback (most recent call last):
  File "/waka-readme-stats/main.py", line 221, in <module>
    run(main())
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
  File "/usr/local/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
  File "/waka-readme-stats/main.py", line 210, in main
    GHM.update_readme(stats)
  File "/waka-readme-stats/manager_github.py", line 113, in update_readme
    new_readme = sub(GitHubManager._README_REGEX, readme_stats, readme_contents)
  File "/usr/local/lib/python3.11/re/__init__.py", line 185, in sub
    return _compile(pattern, flags).sub(repl, string, count)
  File "/usr/local/lib/python3.11/re/__init__.py", line 317, in _subx
    template = _compile_repl(template, pattern)
  File "/usr/local/lib/python3.11/re/__init__.py", line 308, in _compile_repl
    return _parser.parse_template(repl, pattern)
  File "/usr/local/lib/python3.11/re/_parser.py", line 1078, in parse_template
    raise s.error('bad escape %s' % this, len(this)) from None
re.error: bad escape \U at position 1874 (line 40, column 3)
sys:1: RuntimeWarning: coroutine 'AsyncClient.get' was never awaited

@willnaoosmith

Having the same issue for weeks too!

@pseusys
Collaborator

pseusys commented Mar 28, 2023

@ddok2, @eby8zevin, @Fanduzi, @moncheeta, @yanskun, @UnixBear, @willnaoosmith
Could any of you please run the action with debug logs enabled?
Apparently some commits are coming back as None; I would like to see which repositories and branches those commits belong to.

@willnaoosmith

willnaoosmith commented Mar 28, 2023

Done!
Here's the part of the log that I think is the important one:

[...]

Preparing metadata (pyproject.toml): finished with status 'done'
ERROR: Ignored the following versions that require a different python version: 1.21.2 Requires-Python >=3.7,<3.11; 1.21.3 Requires-Python >=3.7,<3.11; 1.21.4 Requires-Python >=3.7,<3.11; 1.21.5 Requires-Python >=3.7,<3.11; 1.21.6 Requires-Python >=3.7,<3.11
ERROR: Could not find a version that satisfies the requirement opencv-python==4.2.0.34 (from versions: 3.4.0.14, 3.4.10.37, 3.4.11.39, 3.4.11.41, 3.4.11.43, 3.4.11.45, 3.4.13.47, 3.4.15.55, 3.4.16.57, 3.4.16.59, 3.4.17.61, 3.4.17.63, 3.4.18.65, 4.3.0.38, 4.4.0.40, 4.4.0.42, 4.4.0.44, 4.4.0.46, 4.5.1.48, 4.5.3.56, 4.5.4.58, 4.5.4.60, 4.5.5.62, 4.5.5.64, 4.6.0.66, 4.7.0.68, 4.7.0.72)
ERROR: No matching distribution found for opencv-python==4.2.0.34
The command '/bin/sh -c pip install -r requirements.txt' returned a non-zero code: 1

Warning: Docker build failed with exit code 1, back off 2.58 seconds before retry.
/usr/bin/docker build -t 6c0442:7c019a37c0d94605a8feced2f4c32637 -f "/home/runner/work/_actions/anmol098/waka-readme-stats/V3/Dockerfile" "/home/runner/work/_actions/anmol098/waka-readme-stats/V3"

[...]

 Error: Docker build failed with exit code 1
##[debug]System.InvalidOperationException: Docker build failed with exit code 1
##[debug]   at GitHub.Runner.Worker.ActionManager.BuildActionContainerAsync(IExecutionContext executionContext, Object data)
##[debug]   at GitHub.Runner.Worker.JobExtensionRunner.RunAsync()
##[debug]   at GitHub.Runner.Worker.StepsRunner.RunStepAsync(IStep step, CancellationToken jobCancellationToken)
##[debug]Finishing: Build anmol098/waka-readme-stats@V3

[...]

Looks like it's a simple dependency issue.
I tried running the same command (pip install opencv-python==4.2.0.34) on my own computer and got the same error, even though this version does exist here.

[EDIT]
BTW, the log above is from V3, not master as @ddok2 uses.

Here's the log for the master version, which gives the 'coroutine was never awaited' error:

[...]

Traceback (most recent call last):
  File "/waka-readme-stats/main.py", line 221, in <module>
    run(main())
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 208, in main
    stats = await get_stats()
            ^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 156, in get_stats
    yearly_data, commit_data = await calculate_commit_data(repositories)
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/yearly_commit_calculator.py", line 38, in calculate_commit_data
    await update_data_with_commit_stats(repo, yearly_data, date_data)
  File "/waka-readme-stats/yearly_commit_calculator.py", line 64, in update_data_with_commit_stats
    commit_data = await DM.get_remote_graphql("repo_commit_list", owner=owner, name=repo_details["name"], branch=branch["name"], id=GHM.USER.node_id)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 293, in get_remote_graphql
    res = await DownloadManager._fetch_graphql_paginated(query, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 267, in _fetch_graphql_paginated
    initial_query_response = await DownloadManager._fetch_graphql_query(query, **kwargs, pagination="first: 100")
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 229, in _fetch_graphql_query
    return await DownloadManager._fetch_graphql_query(query, retries_count - 1, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 229, in _fetch_graphql_query
    return await DownloadManager._fetch_graphql_query(query, retries_count - 1, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 229, in _fetch_graphql_query
    return await DownloadManager._fetch_graphql_query(query, retries_count - 1, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  [Previous line repeated 7 more times]
  File "/waka-readme-stats/manager_download.py", line 231, in _fetch_graphql_query
    raise Exception(f"Query '{query}' failed to run by returning code of {res.status_code}: {res.json()}")
Exception: Query 'repo_commit_list' failed to run by returning code of 502: {'data': None, 'errors': [{'message': 'Something went wrong while executing your query. This may be the result of a timeout, or it could be a GitHub bug. Please include `7028:69F2:240738D:49FE50C:642310BC` when reporting this issue.'}]}
sys:1: RuntimeWarning: coroutine 'AsyncClient.get' was never awaited
##[debug]Docker Action run completed with exit code 1
##[debug]Finishing: Generate Waka Stats

[...]

And, here's my YAML code:

[...]

runs-on: ubuntu-latest

[...]

- name: Generate Waka Stats
  uses: anmol098/waka-readme-stats@master
  with:
    WAKATIME_API_KEY: ${{ secrets.WAKATIME_API_KEY }}
    GH_TOKEN: ${{ secrets.GH_TOKEN }}
    SHOW_PROJECTS: "False"
    SHOW_LOC_CHART: "False" 
    SHOW_PROFILE_VIEWS: "False" 
    SHOW_LANGUAGE_PER_REPO: "False"
    SHOW_COMMIT: "False"
    SHOW_DAYS_OF_WEEK: "False"
    SHOW_TIMEZONE: "False"
    SHOW_UPDATED_DATE: "False"
    SHOW_LINES_OF_CODE: "True"
    LOCALE: "en"

[...]

@iamgojoof6eyes

iamgojoof6eyes commented Apr 1, 2023

(Quoted @willnaoosmith's debug logs and YAML configuration from the comment above.)

I am facing the same error too. My YAML is quite different.

@RedBoardDev

Hi! I have the same error and I don't know how to fix it.

@doctormin

doctormin commented Apr 5, 2023

In my case, timeout errors happened when fetching the commit history of a large repo of mine, so this bug can be temporarily worked around by setting the SHOW_COMMIT flag to "False" in your GitHub Action YML file.

You can give it a try: example

@eby8zevin

@doctormin I used all your flag settings, and it worked. Thank you!

@iamgojoof6eyes

iamgojoof6eyes commented Apr 5, 2023

@doctormin Thank you for the advice; however, it didn't work for me. I then noticed that the error in my run was caused by the locale defined in the YML file (LOCALE: "en_IN" in my case). Since it defaults to English, I simply removed it, and the action started working with the SHOW_COMMIT flag set to "True".

Here is my YML file

@willnaoosmith

willnaoosmith commented Apr 5, 2023

@iamgojoof6eyes same here.
I already had the SHOW_COMMIT flag set to "False" from the start, so it wasn't working.
It also didn't work after setting the SHOW_COMMIT flag to "True".

Error:

File "/waka-readme-stats/main.py", line 156, in get_stats
  yearly_data, commit_data = await calculate_commit_data(repositories)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/waka-readme-stats/yearly_commit_calculator.py", line 38, in calculate_commit_data
  await update_data_with_commit_stats(repo, yearly_data, date_data)
File "/waka-readme-stats/yearly_commit_calculator.py", line 64, in update_data_with_commit_stats
  commit_data = await DM.get_remote_graphql("repo_commit_list", owner=owner, name=repo_details["name"], branch=branch["name"], id=GHM.USER.node_id)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/waka-readme-stats/manager_download.py", line 293, in get_remote_graphql
  res = await DownloadManager._fetch_graphql_paginated(query, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/waka-readme-stats/manager_download.py", line 267, in _fetch_graphql_paginated
  initial_query_response = await DownloadManager._fetch_graphql_query(query, **kwargs, pagination="first: 100")
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/waka-readme-stats/manager_download.py", line 229, in _fetch_graphql_query
  return await DownloadManager._fetch_graphql_query(query, retries_count - 1, **kwargs)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/waka-readme-stats/manager_download.py", line 229, in _fetch_graphql_query
  return await DownloadManager._fetch_graphql_query(query, retries_count - 1, **kwargs)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/waka-readme-stats/manager_download.py", line 229, in _fetch_graphql_query
  return await DownloadManager._fetch_graphql_query(query, retries_count - 1, **kwargs)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[Previous line repeated 7 more times]
File "/waka-readme-stats/manager_download.py", line 231, in _fetch_graphql_query
  raise Exception(f"Query '{query}' failed to run by returning code of {res.status_code}: {res.json()}")
Exception: Query 'repo_commit_list' failed to run by returning code of 502: {'data': None, 'errors': [{'message': 'Something went wrong while executing your query. This may be the result of a timeout, or it could be a GitHub bug. Please include `4481:3E96:1CE9FF:1DCD42:642D4A29` when reporting this issue.'}]}

Maybe it's because my profile readme has almost 2.7K commits?

[EDIT]: Same error message with the SHOW_COMMIT flag set to either "False" or "True".

Here's my YML file

@iamgojoof6eyes

iamgojoof6eyes commented Apr 5, 2023

(Quoted @willnaoosmith's comment above in full.)

Your error is quite different from my error.

@willnaoosmith

@iamgojoof6eyes Please share your error message.
If you want to, try enabling debug logs by creating a secret on your repository named ACTIONS_STEP_DEBUG with its value set to true, then run the action and share the logs.

ddok2 added a commit to ddok2/ddok2 that referenced this issue Apr 6, 2023
@ddok2
Author

ddok2 commented Apr 6, 2023

@pseusys I ran the action in debug mode as you said. That's how I got this log: LOG
If you look at line 69 of the log, it seems the error occurs while fetching a private repo. How can I get the name of this private repo?

Here's my YAML code:

jobs:
  update-readme:
    name: Update Readme with Metrics
    runs-on: ubuntu-latest
    steps:
      - uses: anmol098/waka-readme-stats@master
        with:
          WAKATIME_API_KEY: ${{ secrets.WAKATIME_API_KEY }}
          GH_TOKEN: ${{ secrets.GH_TOKEN }}
          SHOW_OS: "True"
          SHOW_PROJECTS: "False"
          SHOW_PROFILE_VIEWS: "False"
          SHOW_EDITORS: "True"
          SHOW_LANGUAGE_PER_REPO: "False"
          SHOW_LOC_CHART: "False"
          SHOW_LINES_OF_CODE: "True"
          SHOW_COMMIT: "False"
          SHOW_SHORT_INFO: "False"

@pseusys
Collaborator

pseusys commented Apr 6, 2023

@ddok2 So, this behavior is not expected.
If you take a look at the GraphQL GitHub API documentation on commit, the committedDate field appears to be non-nullable (DateTime!), so I would like to examine the issue before applying any patches and/or hotfixes.
There are two options for investigating the issue further:

  1. You can run the action locally, modifying the code around this line in order to find the repository name and commit ID (see the sketch after this comment).
  2. You can run the action queries described in this file on this website - but this might take a lot of time.

Unfortunately, I can't run the queries for you because the repository in question is private, so my GitHub token doesn't have access to it.
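
For option 1, a rough idea of what that local modification could look like (purely a hypothetical sketch: DM and GHM stand for the action's download and GitHub managers as they appear in the tracebacks above, and the real function differs):

from re import search

# DM / GHM are assumed to come from the action's manager_download / manager_github modules.
async def update_data_with_commit_stats_debug(repo_details, owner, branch_data, yearly_data, date_data):
    # Same loop as in yearly_commit_calculator.py, but it reports any commit entry
    # whose committedDate is missing instead of crashing on .group().
    for branch in branch_data["data"]["repository"]["refs"]["nodes"]:
        commit_data = await DM.get_remote_graphql("repo_commit_list", owner=owner, name=repo_details["name"], branch=branch["name"], id=GHM.USER.node_id)
        for commit in commit_data["data"]["repository"]["ref"]["target"]["history"]["nodes"]:
            if commit is None or commit.get("committedDate") is None:
                print(f"Suspicious commit in {owner}/{repo_details['name']} on branch {branch['name']}: {commit!r}")
                continue
            date = search(r"\d+-\d+-\d+", commit["committedDate"]).group()
            # ...aggregate `date` into yearly_data / date_data as the original function does...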

@lgc2333

lgc2333 commented Apr 11, 2023

#443

really strange...
image

image

@lgc2333

lgc2333 commented Apr 11, 2023

The error reappears when I run this locally, and it gives me this error:

{
  "errors": [
    {
      "type": "FORBIDDEN",
      "path": [
        "user",
        "repositories",
        "nodes",
        3
      ],
      "extensions": {
        "saml_failure": false
      },
      "locations": [
        {
          "line": 5,
          "column": 13
        }
      ],
      "message": "`LiteLDev` forbids access via a personal access token (classic). Please use a GitHub App, OAuth App, or a personal access token with fine-grained permissions."
    },
    {
      "type": "FORBIDDEN",
      "path": [
        "user",
        "repositories",
        "nodes",
        18
      ],
      "extensions": {
        "saml_failure": false
      },
      "locations": [
        {
          "line": 5,
          "column": 13
        }
      ],
      "message": "`LiteLDev` forbids access via a personal access token (classic). Please use a GitHub App, OAuth App, or a personal access token with fine-grained permissions."
    }
  ]
}

and the value of those nodes in the response is None:

[
  {
    "primaryLanguage": {
      "name": "TypeScript"
    },
    "name": "fuck-cors",
    "owner": {
      "login": "lgc2333"
    },
    "isPrivate": false
  },
  null,  // HERE
  {
    "primaryLanguage": {
      "name": "Python"
    },
    "name": "nonebot_template_plugin",
    "owner": {
      "login": "Ikaros-521"
    },
    "isPrivate": false
  },
]

@pseusys
Collaborator

pseusys commented Apr 12, 2023

@lgc2333 OK, so here we can see that this organisation has prohibited access to its repositories for users with old personal access tokens (classic).
I think we should just filter out None repository objects (as they can appear for various reasons) and log error messages. For now, I would suggest using the new fine-grained access tokens.
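
A minimal sketch of what that filtering might look like, assuming the response shape visible in the tracebacks above (an illustration, not the actual patch):

def usable_repository_nodes(repositories: dict) -> list:
    # Drop null repository nodes (e.g. repos an organisation hides from classic tokens)
    # and report how many were skipped, instead of crashing on repo["name"] later.
    nodes = repositories["data"]["user"]["repositories"]["nodes"]
    skipped = sum(1 for node in nodes if node is None)
    if skipped:
        print(f"Skipped {skipped} repositories that the token cannot access")
    return [node for node in nodes if node is not None]

# Usage, instead of iterating the raw node list:
# repo_names = [repo["name"] for repo in usable_repository_nodes(repositories)]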

@tanjuntao

@doctormin Thanks for sharing; your YML file works for me.

@lgc2333

lgc2333 commented Apr 16, 2023

@lgc2333 Ok, so here we can see that this organisation has prohibited access to its repositories to the users with old personal access tokens (classic). I think we should just filter out None repository objects (as they seem to be able to appear for various reasons) and log error messages. As for now, I would suggest using new fine-grained access tokens.

I can't use fine-grained access tokens with this action:

Exception: Query 'user_repository_list' failed to run by returning code of 401: {'message': 'Personal access tokens with fine grained access do not support the GraphQL API', 'documentation_url': 'https://docs.github.com/graphql/guides/forming-calls-with-graphql#authenticating-with-graphql'}

https://github.com/lgc2333/lgc2333/actions/runs/4713861175/jobs/8359827916

@ddok2
Author

ddok2 commented Apr 29, 2023

@pseusys
I ran it locally as you said.
I found the following error.
This doesn't happen all the time, but about 8 out of 10 times.
That's weird.

The error came from this line of code

image

@pseusys
Collaborator

pseusys commented Apr 29, 2023

@ddok2, check out this PR, it might fix your issue: #449

@ddok2
Author

ddok2 commented Apr 30, 2023

@pseusys, I still get an error on this PR: #449.
The repository information is fetched just fine,
but commit_data: dict returned via graphql("repo_commit_list") is None at yearly_commit_calculator.py#L65.

I modified the code as below and it works fine (just adding a None filter on the repo_commit_list results).

...
for branch in branch_data["data"]["repository"]["refs"]["nodes"]:
    commit_data = await DM.get_remote_graphql("repo_commit_list", owner=owner, name=repo_details["name"], branch=branch["name"], id=GHM.USER.node_id)
    for commit in [c for c in commit_data["data"]["repository"]["ref"]["target"]["history"]["nodes"] if c is not None]:
        date = search(r"\d+-\d+-\d+", commit["committedDate"]).group()
...

Is it okay for me to commit to this PR (#449)?

@pseusys
Collaborator

pseusys commented Apr 30, 2023

@ddok2, could you please investigate how it happens that committedDate is None?
According to the GitHub documentation, it's a required field.
If it does become None sometimes, we can use the None check you propose, and I would suggest writing to GitHub support about the issue.
In general, in my opinion, we would rather rely on the documentation than double-check everything.

@aravindvnair99 aravindvnair99 assigned pseusys and unassigned anmol098 Apr 30, 2023
@ddok2
Author

ddok2 commented May 1, 2023

@pseusys, I agree with your comment. I'll inspect the issue again.
If this is a problem with GitHub, I will open an issue with GitHub support and close this issue.

@ddok2
Author

ddok2 commented May 3, 2023

This issue was a GitHub problem.
DM.get_remote_graphql returned None when the API responded with 503 Service Unavailable.
I'm closing this issue;
if anyone has the same issue for a different reason, please reopen it.

@ddok2 ddok2 closed this as completed May 3, 2023
@mikebronner

mikebronner commented May 3, 2023

Yeah, this is failing for me across the board, as I previously described, with the following stack trace:

/usr/bin/docker run --name wakareadmestatswakareadmestatsmaster_9e351b --label ed866e --workdir /github/workspace --rm -e "INPUT_WAKATIME_API_KEY" -e "INPUT_GH_TOKEN" -e "INPUT_SHOW_PROJECTS" -e "INPUT_SECTION_NAME" -e "INPUT_PULL_BRANCH_NAME" -e "INPUT_PUSH_BRANCH_NAME" -e "INPUT_SHOW_OS" -e "INPUT_SHOW_EDITORS" -e "INPUT_SHOW_TIMEZONE" -e "INPUT_SHOW_COMMIT" -e "INPUT_SHOW_LANGUAGE" -e "INPUT_SHOW_LINES_OF_CODE" -e "INPUT_SHOW_LANGUAGE_PER_REPO" -e "INPUT_SHOW_LOC_CHART" -e "INPUT_SHOW_DAYS_OF_WEEK" -e "INPUT_SHOW_PROFILE_VIEWS" -e "INPUT_SHOW_SHORT_INFO" -e "INPUT_SHOW_UPDATED_DATE" -e "INPUT_SHOW_TOTAL_CODE_TIME" -e "INPUT_COMMIT_BY_ME" -e "INPUT_COMMIT_MESSAGE" -e "INPUT_COMMIT_USERNAME" -e "INPUT_COMMIT_EMAIL" -e "INPUT_COMMIT_SINGLE" -e "INPUT_LOCALE" -e "INPUT_UPDATED_DATE_FORMAT" -e "INPUT_IGNORED_REPOS" -e "INPUT_SYMBOL_VERSION" -e "INPUT_DEBUG_LOGGING" -e "HOME" -e "GITHUB_JOB" -e "GITHUB_REF" -e "GITHUB_SHA" -e "GITHUB_REPOSITORY" -e "GITHUB_REPOSITORY_OWNER" -e "GITHUB_REPOSITORY_OWNER_ID" -e "GITHUB_RUN_ID" -e "GITHUB_RUN_NUMBER" -e "GITHUB_RETENTION_DAYS" -e "GITHUB_RUN_ATTEMPT" -e "GITHUB_REPOSITORY_ID" -e "GITHUB_ACTOR_ID" -e "GITHUB_ACTOR" -e "GITHUB_TRIGGERING_ACTOR" -e "GITHUB_WORKFLOW" -e "GITHUB_HEAD_REF" -e "GITHUB_BASE_REF" -e "GITHUB_EVENT_NAME" -e "GITHUB_SERVER_URL" -e "GITHUB_API_URL" -e "GITHUB_GRAPHQL_URL" -e "GITHUB_REF_NAME" -e "GITHUB_REF_PROTECTED" -e "GITHUB_REF_TYPE" -e "GITHUB_WORKFLOW_REF" -e "GITHUB_WORKFLOW_SHA" -e "GITHUB_WORKSPACE" -e "GITHUB_ACTION" -e "GITHUB_EVENT_PATH" -e "GITHUB_ACTION_REPOSITORY" -e "GITHUB_ACTION_REF" -e "GITHUB_PATH" -e "GITHUB_ENV" -e "GITHUB_STEP_SUMMARY" -e "GITHUB_STATE" -e "GITHUB_OUTPUT" -e "RUNNER_OS" -e "RUNNER_ARCH" -e "RUNNER_NAME" -e "RUNNER_TOOL_CACHE" -e "RUNNER_TEMP" -e "RUNNER_WORKSPACE" -e "ACTIONS_RUNTIME_URL" -e "ACTIONS_RUNTIME_TOKEN" -e "ACTIONS_CACHE_URL" -e GITHUB_ACTIONS=true -e CI=true -v "/var/run/docker.sock":"/var/run/docker.sock" -v "/home/runner/work/_temp/_github_home":"/github/home" -v "/home/runner/work/_temp/_github_workflow":"/github/workflow" -v "/home/runner/work/_temp/_runner_file_commands":"/github/file_commands" -v "/home/runner/work/mikebronner/mikebronner":"/github/workspace" wakareadmestats/waka-readme-stats:master
Traceback (most recent call last):
  File "/waka-readme-stats/main.py", line 221, in <module>
    run(main())
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 208, in main
    stats = await get_stats()
            ^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 156, in get_stats
    yearly_data, commit_data = await calculate_commit_data(repositories)
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/yearly_commit_calculator.py", line 38, in calculate_commit_data
    await update_data_with_commit_stats(repo, yearly_data, date_data)
  File "/waka-readme-stats/yearly_commit_calculator.py", line 64, in update_data_with_commit_stats
    commit_data = await DM.get_remote_graphql("repo_commit_list", owner=owner, name=repo_details["name"], branch=branch["name"], id=GHM.USER.node_id)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 293, in get_remote_graphql
    res = await DownloadManager._fetch_graphql_paginated(query, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 271, in _fetch_graphql_paginated
    query_response = await DownloadManager._fetch_graphql_query(query, **kwargs, pagination=pagination)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 229, in _fetch_graphql_query
    return await DownloadManager._fetch_graphql_query(query, retries_count - 1, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/manager_download.py", line 231, in _fetch_graphql_query
    raise Exception(f"Query '{query}' failed to run by returning code of {res.status_code}: {res.json()}")
Exception: Query 'repo_commit_list' failed to run by returning code of 403: {'documentation_url': 'https://docs.github.com/en/free-pro-team@latest/rest/overview/resources-in-the-rest-api#secondary-rate-limits', 'message': 'You have exceeded a secondary rate limit. Please wait a few minutes before you try again.'}
sys:1: RuntimeWarning: coroutine 'AsyncClient.get' was never awaited

Seems like this is due to rate limit issues? I never had this happen until 11 Mar 2023. Is there something I can do to avoid the rate limit?

@willnaoosmith

I think the problem is that once the repository has enough commits, it takes too long for GitHub to process the data, so it returns a 503 error.
Re-creating the repository and cleaning the commit history might do it.

Mine has 2,720 commits as of now.
@mikebronner has 1K.
So it might be a new limitation on the GitHub API, or something like this.

If anyone tries to fix it using my suggestion, let us know!

@pseusys
Collaborator

pseusys commented May 4, 2023

(Quoted @willnaoosmith's comment above.)

I think we should file an issue with GitHub support, shouldn't we?

@willnaoosmith

I think we should file an issue with GitHub support, shouldn't we?

Yup, both would be good.
Once I get some time, I'm going to try what I suppose might work, just to get some intel on the cause of the problem and help a little!

@willnaoosmith


I tried running the action again after cleaning the repository commit history using this:

git checkout --orphan latest_branch
git add -A
git commit -am "init"
git branch -D master
git branch -m master
git push -f origin master

and… it didn't work

One thing that I did see is that the code is fetching some repositories that aren't mine (I have 58, it's fetching 64).
And the code gets stuck on the same repository every time I run the action:

image

Is there a way/option to make it fetch only repositories on my personal profile? Looks like it's fetching a repository I helped on, but since it's private, I can't tell which one it is.

@pseusys pseusys reopened this May 9, 2023
@pseusys
Collaborator

pseusys commented May 9, 2023

@willnaoosmith, there's no such option yet, but we are always open to ideas and/or proposals!

@thenithinbalaji
Contributor

I am facing the same issue

Traceback (most recent call last):
  File "/waka-readme-stats/main.py", line 221, in <module>
    run(main())
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 208, in main
    stats = await get_stats()
            ^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 153, in get_stats
    repositories = await collect_user_repositories()
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 1[32](https://github.com/thenithinbalaji/thenithinbalaji/actions/runs/5074867020/jobs/9115515061#step:3:33), in collect_user_repositories
    repo_names = [repo["name"] for repo in repositories["data"]["user"]["repositories"]["nodes"]]
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/waka-readme-stats/main.py", line 132, in <listcomp>
    repo_names = [repo["name"] for repo in repositories["data"]["user"]["repositories"]["nodes"]]
                  ~~~~^^^^^^^^
TypeError: 'NoneType' object is not subscriptable
sys:1: RuntimeWarning: coroutine 'AsyncClient.get' was never awaited

Reading the issue discussion, I understand this is a problem with the GitHub API and not with this workflow. Am I right?

@mikebronner

I opened a ticket with GitHub for this issue to see what suggestions they might have. I received the following response, likely confirming what was already known:

In order to provide quality service on GitHub, additional rate limits may apply to some actions when calling our API. For example, using the API to rapidly create content, poll aggressively instead of using webhooks, make multiple concurrent requests, or repeatedly request data that is computationally expensive may result in secondary rate-limiting. The algorithm behind anti-abuse rate limits is not that simple, so we can't publish specific details in our public documentation. I'm sure you would also agree that having them published defeats its purpose.

In general, these secondary rate limits are in place to prevent bad actors on GitHub and some work based on the CPU time used by the query. We know you don't have such intentions, but we need to protect the API from the few bad citizens that are out there. For the specific limiter you are hitting, there isn't a way I can guide you to the point you could always avoid hitting the limits. We can't predict with certainty how many CPU seconds a certain request will consume. For example, today it might be 1 second, tomorrow it might be 1.5, and the day after 0.5, depending on various factors (e.g. system load).

As you can't know in advance how many seconds your requests will consume, it's not possible for you to avoid these limits completely. Instead, you need to be aware of them, spread out your requests more, and also cover the possibility of hitting them, and then back off for a bit. For GraphQL specifically you could consider reducing the number of requested results in order to reduce the risk of running into a secondary rate limit.

Hoping this helps? It sounds like the only solution is to create some form of elaborate query mechanism that reduces batch size and increases delays between batches until it finds a sweet spot.
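
For what it's worth, the "smaller batches, longer delays" idea could look roughly like the helper below; fetch_page here is a hypothetical stand-in for the action's GraphQL query call, not its real API:

from asyncio import sleep

async def fetch_with_backoff(fetch_page, page_size=100, max_attempts=5):
    # Retry on rate-limit/server errors, waiting longer and requesting fewer
    # results each time, until a request finally succeeds.
    delay = 1.0
    for attempt in range(max_attempts):
        try:
            return await fetch_page(first=page_size)
        except Exception as error:  # e.g. the 403 secondary rate limit or 502 seen above
            print(f"Attempt {attempt + 1} failed ({error}); retrying with first={page_size}")
            await sleep(delay)
            delay *= 2                           # spread requests out more
            page_size = max(10, page_size // 2)  # and reduce the requested batch size
    raise RuntimeError("Giving up after repeated rate limit errors")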

@pseusys
Collaborator

pseusys commented May 2, 2024

Yeah, honestly, I can't come up with a really good solution. Since we are speaking about an action that runs on GitHub servers as well, increasing execution time might run into the rate limits on GitHub-hosted action time. I don't think there is a way of using GraphQL queries more efficiently than we do now (however, if you know such a way, we would all be grateful for any ideas).
Another solution that comes to mind (still far from perfect, though) is implementing a file cache. We could store all the information we have fetched once in a binary cache file (e.g. using the Python pickle module) and then query only the repositories, branches and commits that were created and/or modified after the last time the cache file was updated. After every fetch the cache file should be updated accordingly. Personally, I haven't checked whether that would reduce the request count greatly, but since we already keep and commit binary (image) files, that might be a decent solution.
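
A very rough sketch of that cache idea; the file name and structure here are made up for illustration and are not part of the action today:

import pickle
from pathlib import Path

CACHE_FILE = Path("waka_readme_stats_cache.pkl")  # hypothetical cache location

def load_cache() -> dict:
    # Return previously fetched data, or an empty cache on the first run.
    if CACHE_FILE.exists():
        with CACHE_FILE.open("rb") as handle:
            return pickle.load(handle)
    return {"repositories": {}, "last_fetch": None}

def save_cache(cache: dict) -> None:
    # Persist everything fetched so far, so the next run only needs to query
    # repositories, branches and commits changed since `last_fetch`.
    with CACHE_FILE.open("wb") as handle:
        pickle.dump(cache, handle)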

@mikebronner

@pseusys thanks for responding with your insights. I'm not at all experienced in GraphQL, so I probably can't contribute much other than ideas, but I do like your caching idea. It sounds like a decent way of working with it.

Further, that could probably be set up to start caching on the first run, retain the cache when it errors out, then use it for the second run, until it can complete without errors. It would be cool if, on a rate limit error, the action could be rescheduled to run again an hour or two later; if not, it would just take a few days to get caught up.

@pseusys
Collaborator

pseusys commented May 2, 2024

Well, I don't think we are really able to reschedule actions, only delay within the GitHub Action execution time limit (which is something like 5 hours max, I guess). Yes, and in case of an error we can indeed proceed with the results we have obtained, cache them, and maybe display a warning badge at the bottom of the readme, like this:

Warning

The action wasn't able to retrieve all user data during the latest run due to GitHub internal API limitations. We hope that the info gathered in the next run will be more complete!

And during the next run we can use the cache for the repos/PRs/commits/branches that have not been updated since.
