One of the scripts in the examples/ folder of Accelerate, or an officially supported no_trainer script in the examples folder of the transformers repo (such as run_no_trainer_glue.py)
My own task or dataset (give details below)
Reproduction
1. Run `accelerate config`
2. Set the GPU count (`num_processes`) to 3; other configuration choices such as DeepSpeed or FSDP do not seem to matter
3. Run `accelerate test`
There is no way to run `accelerate test` without setting `num_processes` to a divisor of 8 (1, 2, 4, or 8), since `split_batches` is enabled and `batch_size` is hardcoded to 8.
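The failure comes down to a simple divisibility constraint, illustrated in the sketch below. The helper `can_split` is illustrative only, not part of Accelerate's API: with `split_batches` enabled, each fetched batch must be divided evenly across the processes.

```python
# Minimal sketch of the divisibility constraint behind the failure.
# NOTE: `can_split` is an illustrative helper, not Accelerate's actual API.

def can_split(batch_size: int, num_processes: int) -> bool:
    """With split_batches=True, a batch must divide evenly across processes."""
    return batch_size % num_processes == 0

# batch_size is hardcoded to 8 in test_script.py
assert can_split(8, 1)
assert can_split(8, 2)
assert can_split(8, 4)
assert not can_split(8, 3)  # 3 GPUs: the batch of 8 cannot be split evenly
```

With 3 processes, 8 % 3 != 0, so the hardcoded batch cannot be sharded and the test fails.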
Expected behavior
It is expected that `accelerate test` works regardless of the GPU / process count.
Relevant code: `accelerate/src/accelerate/test_utils/scripts/test_script.py`, line 189 (commit aa21174)