Need help to release a pre-trained model from a GPU server #521
Hi everyone,
I'm a newbie here and looking for your help.
I have a public pre-trained model from a GPU server (download here: https://drive.google.com/drive/folders/0BzY0S4QyX701OFJfbkZ3NmhTb1E). I want to use this model to translate some texts. However, I have no GPU server, and according to the instructions here (http://opennmt.net/OpenNMT/translation/inference/), this model must first be released on a GPU server so that the released model can be used for inference on a CPU server.
Can anyone help me? I would deeply appreciate it.
Best,
An Vo
Comments
Thanks a lot, shahbazsyed.
[icuser@xxx OpenNMT-Torch]$ th translate.lua -model model_en_vi_epoch19_9.12_release.t7 -src train_snli.txt_textSingles -output train_snli.txt_textSingles_VN
@an-fbk Can you share your files? I can try to reproduce the same error.
@shahbazsyed Sure, you can download my source-language file at https://drive.google.com/file/d/1-jT18jgyEHutuXfV93QrWDKE4UKSJ6pH/view?usp=sharing
@shahbazsyed Hi, do you have any findings regarding the error I posted earlier?
Hi,
@shahbazsyed The model is from here: https://gist.github.com/tuan3w/e18d7b4587ed374610bfbaea17bb3f07
No, there was no error when I released it. I couldn't have shared it otherwise.
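For anyone landing on this thread: the release step discussed above is done with the release tool that ships with OpenNMT-Torch, as described on the inference page linked in the issue. A minimal sketch of the command follows; the checkpoint filename, GPU id, and output path are illustrative assumptions, not values from this thread (check `th tools/release_model.lua -h` for the actual defaults):

```shell
# Must be run on a machine with a GPU, since the trained checkpoint
# contains CUDA tensors that need a GPU to be loaded and converted.
# -model:  the trained checkpoint to convert (name is an assumption)
# -gpuid:  id of the GPU used to load the checkpoint
# -output: where to write the CPU-loadable released model (assumed flag)
th tools/release_model.lua \
  -model model_en_vi_epoch19_9.12.t7 \
  -gpuid 1 \
  -output model_en_vi_epoch19_9.12_release.t7

# The released *_release.t7 file can then be used with translate.lua
# on a CPU-only machine, as in the translate command quoted above.
```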