
Hitting GH rate-limit when generating changelog over very large amount of commits #1422

Closed
dariye opened this issue Jul 30, 2020 · 29 comments
Labels
bug Something isn't working

Comments

@dariye

dariye commented Jul 30, 2020

Describe the bug

The auto shipit command somehow manages to exceed GitHub's API rate limit when trying to generate a changelog post-release. (See screenshots.)

To Reproduce

Run auto shipit

To hit this error, you need a repository that has no releases but a very large number of PRs and commits; essentially, a high-velocity codebase.

Expected behavior

A new version is released with a changelog and release.

Screenshots

Screenshot_2020-07-30_at_12_01_49

Screenshot_2020-07-30_at_12_01_18

Environment Information:

"auto" version: v9.49.1
"git"  version: v2.26.0
"node" version: v13.12.0

Project Information:

✔ Repository:      project (https://github.com/username/project)
✔ Author Name:     Paul Dariye
✔ Author Email:    paul.dariye@gmail.com
✔ Current Version: v0.0.23
✔ Latest Release:  0.0.1 (https://github.com/username/project/releases/tag/0.0.1)

✖ Labels configured on GitHub project (Try running "auto create-labels")

GitHub Token Information:

✔ Token:            [Token starting with 76e5]
✔ Repo Permission:  admin
✔ User:             dariye
✔ API:              undefined (undefined)
✔ Enabled Scopes:   read:packages, repo, write:packages
✔ Rate Limit:       552/5000

Additional context

I was able to get around this by passing the --no-changelog flag.

I traced the error to this line of code:

const data = await Promise.all(
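For context, an unbounded `Promise.all` like the one above fires one API request per commit simultaneously, which is exactly the pattern GitHub's abuse detection tends to flag on large histories. A minimal sketch of the usual mitigation, batching, using hypothetical names (`inBatches` is not part of auto):

```javascript
// Hypothetical sketch: instead of one unbounded Promise.all over every
// commit, process the commits in fixed-size batches so that at most
// `size` requests are in flight at any moment.
async function inBatches(items, size, fn) {
  const results = [];
  for (let i = 0; i < items.length; i += size) {
    const batch = items.slice(i, i + size);
    // Each batch still runs in parallel, but never more than `size` at once.
    results.push(...(await Promise.all(batch.map(fn))));
  }
  return results;
}

// Usage with a stand-in for the per-commit API call:
const commits = Array.from({ length: 10 }, (_, i) => `sha${i}`);
inBatches(commits, 3, async (sha) => `pr-for-${sha}`).then((data) =>
  console.log(data.length) // 10
);
```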

@dariye dariye added the bug Something isn't working label Jul 30, 2020
@hipstersmoothie
Collaborator

Any chance I could see the repo to play around with?

@hipstersmoothie
Collaborator

Wow, that's a lot of requests.

@hipstersmoothie
Collaborator

Also, if I could get a fuller log, that might help.

@hipstersmoothie
Collaborator

I was actually able to recreate it locally. Don't need anything from you other than hope 😉

@gr2m

gr2m commented Jul 30, 2020

We create a lot of similar requests in semantic-release, and we haven't hit the abuse limit, not that I'm aware of at least. There should be a 3s timeout between requests that create a comment; do you see that happening in your logs?
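The spacing gr2m describes can be sketched as a serializer that leaves a gap after each content-creating request before the next one starts (`serializeWrites` and `gapMs` are hypothetical names for illustration, not the throttling plugin's actual API):

```javascript
// Sketch: run write requests one at a time, waiting `gapMs` after each
// before the next is allowed to start.
function serializeWrites(gapMs) {
  let chain = Promise.resolve();
  return (fn) => {
    const next = chain.then(fn).then(async (result) => {
      // Pause before the chain moves on to the next queued write.
      await new Promise((resolve) => setTimeout(resolve, gapMs));
      return result;
    });
    chain = next.catch(() => {}); // keep the chain alive after failures
    return next;
  };
}

const write = serializeWrites(3000);
// Each comment-creating call now runs serially, 3 s apart, e.g.:
// write(() => octokit.issues.createComment({ /* ... */ }));
```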

@gr2m

gr2m commented Jul 30, 2020

Just to be sure: do you use a single Octokit instance across all these requests?

@hipstersmoothie
Collaborator

hipstersmoothie commented Jul 30, 2020

Pretty sure we are only creating 1.

That happens here https://github.com/intuit/auto/blob/master/packages/core/src/git.ts#L123

which should only be initialized once at startup https://github.com/intuit/auto/blob/master/packages/core/src/auto.ts#L1643

I can check though

Verified: it's only created once.

@hipstersmoothie
Collaborator

hipstersmoothie commented Jul 30, 2020

To test this issue on auto:

yarn
yarn build
yarn auto changelog --from v1.0.0 -d

@hipstersmoothie
Collaborator

I think this is related to octokit/plugin-throttling.js#108, but we only have one instance of Octokit running, so we shouldn't need to cluster.

@gr2m

gr2m commented Jul 30, 2020

do you do any GraphQL requests?

Can you confirm that you see the "went over the abuse limits" logs occur at a rate of roughly 1 per 3 seconds?

@hipstersmoothie
Collaborator

Ah, that might be it. We use @octokit/graphql, but I saw today that you can do it straight through Octokit. Will try that too.

@gr2m

gr2m commented Jul 30, 2020

we use @octokit/graphql but I saw today that you can do it straight through octokit

Yes, that way it shares the same request settings and request life-cycle hooks.
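The failure mode here can be sketched with a toy client (hypothetical, not octokit's internals): throttling is registered as a request hook on one instance, so a standalone GraphQL helper that bypasses that instance also bypasses the throttle, while routing GraphQL through the same instance does not.

```javascript
// Toy sketch: one client whose REST and GraphQL entry points funnel
// through the same request pipeline, so a hook (e.g. a throttle)
// registered once applies to both kinds of request.
function makeClient() {
  const hooks = [];
  async function request(kind, payload) {
    for (const hook of hooks) await hook(kind, payload); // throttling would live here
    return { kind, payload }; // stand-in for the HTTP response
  }
  return {
    onRequest: (hook) => hooks.push(hook),
    rest: (route) => request("rest", route),
    graphql: (query) => request("graphql", query),
  };
}

const client = makeClient();
let throttled = 0;
client.onRequest(async () => { throttled += 1; }); // stand-in throttle hook

Promise.all([
  client.rest("GET /repos/{owner}/{repo}/pulls"),
  client.graphql("query { viewer { login } }"),
]).then(() => console.log(throttled)); // 2: both call paths hit the hook
```

A separate GraphQL client created outside `makeClient` would never pass through `hooks`, which is the situation being diagnosed above.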

@hipstersmoothie
Collaborator

🙏

@hipstersmoothie
Collaborator

Okay, running now. Got an hour left on my rate-limit wait though 😢

@vincentbriglia
Contributor

@hipstersmoothie you can use our GitHub instance if you're hitting this; we are seeing the same thing.

@hipstersmoothie
Collaborator

Can you link me to the build log?

@vincentbriglia
Contributor

I have sent you the build log, as this is on a closed repo.

@hipstersmoothie
Collaborator

@vincentbriglia I'm pretty sure that I've fixed the problem in your logs in #1424.

Could you install the canary version? It's all under a different scope so you'll have to replace all the package names:

yarn add @auto-canary/auto@9.49.2-canary.1424.17767.0

yarn add @auto-canary/all-contributors@9.49.2-canary.1424.17767.0
yarn add @auto-canary/conventional-commits@9.49.2-canary.1424.17767.0
yarn add @auto-canary/first-time-contributor@9.49.2-canary.1424.17767.0
yarn add @auto-canary/released@9.49.2-canary.1424.17767.0

@dariye If you could try to test too I would appreciate it!

@vincentbriglia
Contributor

Tried it out with

"@auto-canary/all-contributors": "9.49.2-canary.1424.17767.0",
"@auto-canary/auto": "9.49.2-canary.1424.17767.0",
"@auto-canary/conventional-commits": "9.49.2-canary.1424.17767.0",
"@auto-canary/first-time-contributor": "9.49.2-canary.1424.17767.0",
"@auto-canary/npm": "9.49.2-canary.1424.17767.0",
"@auto-canary/released": "9.49.2-canary.1424.17767.0",

Still an issue with the GitHub runner GITHUB_TOKEN.

Trying with a personal token now (I have seen some discrepancies before).

@vincentbriglia
Contributor

Didn't work with a personal token either.

@vincentbriglia
Contributor

I gave you access to the private repo, @hipstersmoothie. Branch from the next branch.

@hipstersmoothie
Collaborator

Thanks! Was able to solve your issue pretty quickly: d6e7be2

I don't think this is the same issue @dariye is having though.

@vincentbriglia
Contributor

Confirmed that fixed the issue I was having, @hipstersmoothie. Have a nice weekend!

@hipstersmoothie hipstersmoothie changed the title Auto shipit hitting GH's API rate limit when trying to generate changelog post release Hitting GH rate-limit when generating changelog over very large amount of commits Jul 31, 2020
@vincentbriglia
Contributor

However, @hipstersmoothie, this change has now stopped publishing and/or correctly calculating semver releases.

@hipstersmoothie
Collaborator

I think auto is behaving as expected. I alluded to this here. This is because the conventional commits plugin treats all non-semver commit messages as skip-release (e.g. chore, docs, etc.).

This is the PR that implemented it: #1086

Would you prefer that the conventional commits plugin not do this? (e.g. not skip for docs/chore/etc.)

@dariye
Author

dariye commented Aug 3, 2020

Replying to @hipstersmoothie's canary-version instructions above:

I'll test it today and get back to you.

@dariye
Author

dariye commented Aug 6, 2020

@hipstersmoothie this is still an issue when generating a changelog.

Everything seemed to work when I had the --no-changelog flag. However, I took it off so we could use auto shipit in CI, and it fails with the same rate-limit error.

@dariye
Author

dariye commented Aug 6, 2020

@hipstersmoothie I added a little more context ☝️

@dariye
Author

dariye commented Aug 20, 2020

@hipstersmoothie thanks a lot for looking into this. I think the recent releases have fixed it, so I'll go ahead and close this. auto is working pretty nicely for us now. Thanks again!

@dariye dariye closed this as completed Aug 20, 2020