
[CodeCamp2023-471] Support for automatic selection of a batch size that maximizes the utilization of GPU memory. #1354

Open
wants to merge 1 commit into base: main
Conversation

LALBJ

@LALBJ LALBJ commented Sep 11, 2023

Motivation

Automatically adjust the batch size to prevent OOM errors. See #1220 for more details.

Modification

An `auto_batchsize` parameter is added to the Runner to enable automatic batch size selection. When it is set, the Runner checks the batch size and runs training with an appropriate value.

To implement this feature, a `_check_batchsize` method is introduced and invoked from `runner.train()`. It monitors execution for OOM errors; whenever an OOM error occurs, the batch size is halved and training is retried.
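The retry logic can be sketched roughly as follows. This is a minimal illustration of the halving strategy, not the PR's actual code: the helper name `find_max_batchsize` and the fake training step are assumptions made for the example, and real code would catch CUDA OOM `RuntimeError`s raised by PyTorch.

```python
def find_max_batchsize(run_step, batch_size):
    """Halve batch_size on OOM until run_step succeeds.

    A sketch of the strategy described above; PyTorch signals CUDA OOM
    as a RuntimeError whose message contains "out of memory".
    """
    while batch_size >= 1:
        try:
            run_step(batch_size)
            return batch_size
        except RuntimeError as e:
            if "out of memory" not in str(e):
                raise  # unrelated error: propagate unchanged
            batch_size //= 2  # OOM: halve and retry

    raise RuntimeError("No batch size fits in GPU memory.")


# Hypothetical step that "runs out of memory" above 16 samples per batch.
def fake_step(bs):
    if bs > 16:
        raise RuntimeError("CUDA out of memory. Tried to allocate ...")


print(find_max_batchsize(fake_step, 100))  # 100 -> 50 -> 25 -> 12, prints 12
```

Note that halving can overshoot downward (here 25 fails but some value between 17 and 25 might fit), so this trades some memory utilization for simplicity.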

Checklist

  1. Pre-commit or other linting tools are used to fix the potential lint issues.
  2. The modification is covered by complete unit tests. If not, please add more unit tests to ensure the correctness.
  3. If the modification has potential influence on downstream projects, this PR should be tested with downstream projects, like MMDet or MMCls.
  4. The documentation has been modified accordingly, like docstring or example tutorials.

@CLAassistant

CLAassistant commented Sep 11, 2023

CLA assistant check
All committers have signed the CLA.

@LALBJ LALBJ changed the title feature: implement auto batchsize [CodeCamp2023] Support for automatic selection of a batch size that maximizes the utilization of GPU memory. Sep 11, 2023
@LALBJ LALBJ changed the title [CodeCamp2023] Support for automatic selection of a batch size that maximizes the utilization of GPU memory. [CodeCamp2023-471] Support for automatic selection of a batch size that maximizes the utilization of GPU memory. Sep 11, 2023