
[Question] Is it possible to change the init_args of list elements with giving ActionConfigFile #478

Open
vinnamkim opened this issue Mar 25, 2024 · 2 comments
Labels
bug Something isn't working

Comments

vinnamkim commented Mar 25, 2024

This is a minimal reproduction. I have this Python code, which uses jsonargparse.ArgumentParser:

# test.py

from lightning.pytorch.cli import LRSchedulerCallable, ReduceLROnPlateau
from jsonargparse import ArgumentParser, ActionConfigFile

class Model:
    def __init__(
        self,
        schedulers: list[LRSchedulerCallable] = lambda optim: ReduceLROnPlateau(
            optim,
            monitor="val/map50",
        ),
    ):
        self.schedulers = schedulers


if __name__ == "__main__":
    parser = ArgumentParser()
    parser.add_argument("--model", type=Model)
    parser.add_argument("--config", action=ActionConfigFile)
    cfg = parser.parse_args()
    print(cfg)

and this YAML configuration file for it:

# test.yaml

model:
  class_path: __main__.Model
  init_args:
    schedulers:
      - class_path: torch.optim.lr_scheduler.OneCycleLR
        init_args:
          max_lr: 0.1
      - class_path: lightning.pytorch.cli.ReduceLROnPlateau
        init_args:
          monitor: val/custom
          mode: max
          factor: 0.1
          patience: 4

What I want is to override the monitor value of ReduceLROnPlateau without changing the other configuration values in the YAML file. For example:

python test.py --config test.yaml --model.schedulers.monitor val/mAP50 --print_config

The expected output is:

model:
  class_path: __main__.Model
  init_args:
    schedulers:
    - class_path: torch.optim.lr_scheduler.OneCycleLR
      init_args:
        max_lr: 0.1
        ...
    - class_path: lightning.pytorch.cli.ReduceLROnPlateau
      init_args:
        monitor: val/mAP50
        mode: max
        factor: 0.1
        patience: 4

However, what I actually get is:

model:
  class_path: __main__.Model
  init_args:
    schedulers:
    - class_path: torch.optim.lr_scheduler.OneCycleLR
      init_args:
        max_lr: 0.1
        ...
    - class_path: lightning.pytorch.cli.ReduceLROnPlateau
      init_args:
        monitor: val/mAP50
        mode: min
        factor: 0.1
        patience: 10
        threshold: 0.0001
        threshold_mode: rel
        cooldown: 0
        min_lr: 0.0
        eps: 1.0e-08
        verbose: false

The other init_args of ReduceLROnPlateau are reset to their defaults instead of keeping the values from the configuration file.

Is there any method to allow this kind of overriding?

@mauvilsa
Member

I would consider this a bug. For a class-instance type hint, overriding a single init arg changes only that arg and keeps all the other init_args as before. But this seems not to be working for callables that return a class instance. The behavior must be consistent between these two cases.
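The expected override semantics can be illustrated with a plain recursive dict merge. This is illustrative only, not jsonargparse's actual implementation:

```python
# Illustrative sketch of the expected override semantics: a CLI override
# for a single init_arg should be deep-merged into the config entry,
# leaving sibling init_args untouched. NOT jsonargparse's real code.

def deep_merge(base: dict, override: dict) -> dict:
    """Return base updated with override, merging nested dicts recursively."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

# Entry from the YAML config file
config = {
    "class_path": "lightning.pytorch.cli.ReduceLROnPlateau",
    "init_args": {"monitor": "val/custom", "mode": "max", "patience": 4},
}
# What --model.schedulers.monitor val/mAP50 conceptually supplies
cli_override = {"init_args": {"monitor": "val/mAP50"}}

merged = deep_merge(config, cli_override)
print(merged["init_args"])
# monitor is replaced; mode and patience keep their configured values
```

With these semantics, only monitor changes; the bug reported here is that the whole init_args dict of each list element is instead reset to defaults.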

Note that in #460 there is a request to always reset to defaults. If this gets implemented still the default behavior should be the overriding that you want.

Thank you for reporting!

@mauvilsa
Member

Initially I thought that overriding of arguments was not working correctly for any callable type that returns a class. However, after another look I noticed that without the list, i.e. scheduler: LRSchedulerCallable, the overriding works correctly. Thus, the issue occurs only when the type is nested inside a list. Unfortunately, this may make it more difficult to fix.

Another comment, unrelated to the bug: in the reproduction code the default for schedulers is invalid, because it is a single callable rather than a list of callables.
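Until this is fixed, one possible workaround (a sketch; the file name override.yaml and the exact config-merging behavior are assumptions) is to restate the entire list entry in a second config file, so that no values are left to fall back to defaults:

```yaml
# override.yaml -- hypothetical second config; repeats the full schedulers
# list so every value is given explicitly and nothing can be reset
model:
  init_args:
    schedulers:
      - class_path: torch.optim.lr_scheduler.OneCycleLR
        init_args:
          max_lr: 0.1
      - class_path: lightning.pytorch.cli.ReduceLROnPlateau
        init_args:
          monitor: val/mAP50
          mode: max
          factor: 0.1
          patience: 4
```

This would be passed as python test.py --config test.yaml --config override.yaml. It is verbose, since the whole list must be repeated, but it avoids relying on the broken per-element merge.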
