
[Need help] How to init an object of a class with positional parameters in a yaml config file? Also, how to call a func to init an object in a yaml file? #407

Open
shenmishajing opened this issue Oct 22, 2023 · 4 comments
Labels
enhancement New feature or request

Comments

@shenmishajing

🚀 Feature request

We can already init an object with keyword parameters, which is very helpful. An example:

  encoders:
      class_path: torch.nn.ModuleDict
      init_args:
          modules:
              table:
                class_path: torch.nn.TransformerEncoder
                init_args:
                    encoder_layer:
                        class_path: torch.nn.TransformerEncoderLayer
                        init_args:
                            d_model: 128
                            nhead: 128
                            activation: relu
                            batch_first: true
                    num_layers: 2

But what should I do if I want to use a torch.nn.Sequential or another class requiring positional parameters instead of ModuleDict? The init func of nn.Sequential:

    @overload
    def __init__(self, *args: Module) -> None:
        ...

    @overload
    def __init__(self, arg: 'OrderedDict[str, Module]') -> None:
        ...

    def __init__(self, *args):
        super(Sequential, self).__init__()
        if len(args) == 1 and isinstance(args[0], OrderedDict):
            for key, module in args[0].items():
                self.add_module(key, module)
        else:
            for idx, module in enumerate(args):
                self.add_module(str(idx), module)

And how can I use a func to init an object? For example, I have a func like this:

# encoder.py
from torch import nn

def init_encoder(dim):
    return nn.Sequential(nn.Linear(dim, dim), nn.ReLU())

What should I do if I want something like:

  encoders:
      class_path: torch.nn.ModuleDict
      init_args:
          modules:
              table:
                method_path: encoder.init_encoder
                init_args:
                    dim: 128
@shenmishajing shenmishajing added the enhancement New feature or request label Oct 22, 2023
@mauvilsa
Member

what should I do if I want to use a torch.nn.Sequential

Variable positional arguments (*args) currently cannot be specified in a config file. However, you can easily implement your own Sequential class that supports it, like:

import torch.nn

class Sequential(torch.nn.Sequential):
    def __init__(self, modules: list[torch.nn.Module]):
        super().__init__(*modules)

Then in a config file:

class_path: your.module.Sequential
init_args:
  modules:
  - class_path: module1
  - ...
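
For example, the original encoders config could then be written like this (a sketch, assuming the wrapper above is importable as your.module.Sequential):

encoders:
  class_path: torch.nn.ModuleDict
  init_args:
    modules:
      table:
        class_path: your.module.Sequential
        init_args:
          modules:
          - class_path: torch.nn.Linear
            init_args:
              in_features: 128
              out_features: 128
          - class_path: torch.nn.ReLU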

how to use a func to init an object?

This has been requested in several issues; right now I don't remember which ones to reference here. It is not yet supported, and it is not a simple feature to implement. For the time being, this is only possible by using classes-from-functions.
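
A minimal sketch of the classes-from-functions workaround, assuming jsonargparse's class_from_function helper and a function with a return type annotation (the InitEncoder name and the encoder module path are only illustrative):

# encoder.py
from jsonargparse import class_from_function
from torch import nn

def init_encoder(dim: int) -> nn.Sequential:
    return nn.Sequential(nn.Linear(dim, dim), nn.ReLU())

# Builds a class that, when instantiated, calls init_encoder with the given
# parameters and returns its result, so it can be targeted with class_path.
InitEncoder = class_from_function(init_encoder)

Then in a config file:

class_path: encoder.InitEncoder
init_args:
  dim: 128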

@shenmishajing
Author

@mauvilsa OK, got it, thanks very much for your reply. But is there any plan to support those two features, i.e. initializing objects with positional parameters and initializing objects from functions? Thanks again for you guys' great efforts on this useful lib.

@mauvilsa
Copy link
Member

mauvilsa commented Nov 1, 2023

All features are potential future additions, unless stated otherwise. But note that the list of possible features is large, and I don't mean only the ones publicly seen in issues. There is also a private list of things that can be added/improved. What gets implemented and when depends on many factors.

@shenmishajing
Author

Thanks for your explanation and all your great work. I can't wait for the day I hear those features are supported. :)
