There is a limit of 1000 results per search. #824
Comments
Here is a workaround demonstrating how to retrieve all pull requests in a range of dates, even if there are more than 1000 results. EDIT: I will rewrite this as a generator method that yields results, rather than a class; that will be simpler.
With this class, you can now do this sort of thing:
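A minimal sketch of the date-window approach described above (the helper names here are hypothetical, not part of PyGithub; it assumes PyGithub's `Github.search_issues` and the search API's `created:` qualifier, and the window size may need tuning so each sub-search stays under 1000 matches):

```python
from datetime import date, timedelta

def date_windows(start, end, step_days=30):
    """Yield (window_start, window_end) date pairs covering [start, end]."""
    cursor = start
    while cursor <= end:
        window_end = min(cursor + timedelta(days=step_days - 1), end)
        yield cursor, window_end
        cursor = window_end + timedelta(days=1)

def search_all_issues(gh, query, start, end, step_days=30):
    """Yield every result for `query` created between `start` and `end`,
    one date window at a time, so each sub-search stays under GitHub's
    1000-result cap.  `gh` is a github.Github instance (PyGithub)."""
    for lo, hi in date_windows(start, end, step_days):
        # The created: qualifier restricts each sub-search to one window.
        window_query = f"{query} created:{lo.isoformat()}..{hi.isoformat()}"
        for result in gh.search_issues(window_query):
            yield result
```

Usage would look something like `for pr in search_all_issues(gh, "repo:owner/name is:pr", date(2018, 1, 1), date(2018, 12, 31)): ...`, with `gh = Github(token)`.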
Reading the GitHub API docs about search, I also notice that
Now that I have PyGithub forked and running locally from source (I'm looking at #606), perhaps I can investigate this further.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Re-open this issue?
Does anybody have a solution? This is blocking us from exploring the marketplace.
Got the same problem in 1.55 😩
You can also retrieve more than 1,000 results by following a method similar to what BBI-YggyKing mentioned, but without using the API. However, it may not return all of the results.
Check out https://github.com/oscarpobletes/GitHubMines ! This is an extraction tool that lets you perform a search on GitHub and bypass some limits imposed by the GitHub GraphQL API.
The GitHub search API limits each search to 1000 results. This limit affects searches performed via PyGithub, such as Github.search_issues.
There seems to be no indication that a search has hit this limit: no exception or error is raised, as far as I am aware. Perhaps an exception should be raised when this happens (if it can be detected).
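One way the truncation could be detected, sketched with a hypothetical helper (not part of PyGithub): the search API reports the full match count as total_count, which PyGithub exposes as totalCount on the paginated result list, so any value above 1000 means some matches are unreachable.

```python
# GitHub's search API returns at most this many results per query,
# even though total_count in the response reports the full match count.
SEARCH_RESULT_CAP = 1000

def is_truncated(total_count, cap=SEARCH_RESULT_CAP):
    """Return True when a search reports more matches than the API
    will actually hand back (hypothetical helper, not part of PyGithub)."""
    return total_count > cap

# With PyGithub, a caller could check the paginated list (not run here):
#   results = gh.search_issues("repo:owner/name is:issue")
#   if is_truncated(results.totalCount):
#       raise RuntimeError("search matched more than 1000 results; truncated")
```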
It is possible to work around this limit by issuing multiple search queries, but such queries must be tailored to the particular goal of the search - for example, iterating search_issues over progressive date ranges - and I cannot think of a way to generalise this.
Any thoughts on how to address this? Is there a general solution?
Note that this issue has nothing to do with rate limiting or pagination of results.