
babbler-1900 model doesn't work! #123

Open
hkoohy opened this issue Feb 18, 2022 · 1 comment

Comments

@hkoohy

hkoohy commented Feb 18, 2022

Hi, thank you for this nice and helpful work.
The example code doesn't work with any pretrained model option other than 'bert-base'. I am especially interested in using 'babbler-1900', but I get an error saying 'cannot unpack non-iterable NoneType object'.

'''
import torch
from tape import ProteinBertModel, TAPETokenizer

model = ProteinBertModel.from_pretrained('babbler-1900')  # works fine with 'bert-base'
tokenizer = TAPETokenizer(vocab='unirep')  # iupac is the vocab for TAPE models, use unirep for the UniRep model

# Pfam Family: Hexapep, Clan: CL0536
sequence = 'GCTVEDRCLIGMGAILLNGCVIGSGSLVAAGALITQ'
token_ids = torch.tensor([tokenizer.encode(sequence)])
output = model(token_ids)
sequence_output = output[0]
pooled_output = output[1]
'''

The error I get is:

Model name 'babbler-1900' was not found in model name list (bert-base). We assumed 'babbler-1900' was a path or url but couldn't find any file associated to this path or url.

TypeError Traceback (most recent call last)
Input In [25], in
1 from tape import ProteinBertModel, TAPETokenizer
----> 2 model = ProteinBertModel.from_pretrained('babbler-1900') ##('bert-base')
3 tokenizer = TAPETokenizer(vocab='unirep') # iupac is the vocab for TAPE models, use unirep for the UniRep model
5 # Pfam Family: Hexapep, Clan: CL0536

File ~/opt/anaconda3/envs/tape_env/lib/python3.8/site-packages/tape/models/modeling_utils.py:478, in ProteinModel.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
476 # Load config
477 if config is None:
--> 478 config, model_kwargs = cls.config_class.from_pretrained(
479 pretrained_model_name_or_path, *model_args,
480 cache_dir=cache_dir, return_unused_kwargs=True,
481 # force_download=force_download,
482 # resume_download=resume_download,
483 **kwargs
484 )
485 else:
486 model_kwargs = kwargs

TypeError: cannot unpack non-iterable NoneType object

@franzigeiger

You have to use UniRepModel instead of ProteinBertModel. The 'babbler-1900' checkpoint is a UniRep model, so ProteinBertModel's config lookup fails (it only knows 'bert-base'), which is what produces the NoneType unpacking error.
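A minimal sketch of the suggested fix, assuming tape's UniRepModel exposes the same from_pretrained interface as ProteinBertModel (the indexing of the output tuple follows the snippet above):

'''
import torch
from tape import UniRepModel, TAPETokenizer

# 'babbler-1900' is a UniRep checkpoint, so it must be loaded with UniRepModel
model = UniRepModel.from_pretrained('babbler-1900')
tokenizer = TAPETokenizer(vocab='unirep')  # UniRep models use the unirep vocab

# Pfam Family: Hexapep, Clan: CL0536
sequence = 'GCTVEDRCLIGMGAILLNGCVIGSGSLVAAGALITQ'
token_ids = torch.tensor([tokenizer.encode(sequence)])
output = model(token_ids)
sequence_output = output[0]  # per-residue representations
'''

Note that the first call to from_pretrained downloads the babbler-1900 weights, so it needs network access (or a cached copy).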
