
Reduce code redundancy in Pre-trained modules and the corresponding unittests #221

Open
gpengzhi opened this issue Oct 1, 2019 · 0 comments
Labels: enhancement (New feature or request), topic: modules (Issue about built-in Texar modules)

Comments

@gpengzhi
Collaborator

gpengzhi commented Oct 1, 2019

"What worries me is that our implementation of modules based on pre-trained stuff is a bit too repetitive, so that a seemingly small change would require modifying a bunch of files. This is also true for a lot of tests (not limited to pre-trained ones). Let's keep this in mind so we can improve this in the future."

Originally posted by @huzecong in #220

For the code redundancy in the models, I feel that we can extract task-specific heads that are shared across all (or most) of the pre-trained models. For the redundancy in the tests, many of the tests can be shared across different models; we could write a common test script (or base test class) that a group of classes reuses for unit testing, as sketched below.
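A minimal sketch of the shared-test idea, assuming a hypothetical mixin class and placeholder attribute names (`encoder_class`, `pretrained_model_name`); this is not Texar's actual API, just one way the per-model test files could collapse into a common base:

```python
import unittest

import torch


class PretrainedEncoderTestMixin:
    r"""Hypothetical shared test logic for pre-trained encoder classes.

    Concrete test cases only need to set ``encoder_class`` and
    ``pretrained_model_name``; all assertions live here once instead of
    being copied into every per-model test file.
    """
    encoder_class = None
    pretrained_model_name = None

    def setUp(self):
        self.encoder = self.encoder_class(
            pretrained_model_name=self.pretrained_model_name)

    def test_encode(self):
        # A tiny fake batch of token ids; vocabulary size is arbitrary here.
        inputs = torch.randint(30000, (2, 16))
        outputs = self.encoder(inputs)
        # Every pre-trained encoder is expected to preserve the batch size
        # in its first output; the exact return signature may differ per model.
        self.assertEqual(outputs[0].size(0), 2)


# A concrete test case would then be a two-line subclass, e.g.:
#
# class BERTEncoderTest(PretrainedEncoderTestMixin, unittest.TestCase):
#     encoder_class = BERTEncoder  # assumed import from texar.torch.modules
#     pretrained_model_name = "bert-base-uncased"
```

The same pattern would also cut down the module code itself if the task-specific heads (classification, sequence labeling, etc.) were factored into shared components that each pre-trained backbone plugs into, rather than being re-implemented per model.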

gpengzhi added the enhancement and topic: modules labels on Oct 1, 2019