This repository has been archived by the owner on Jun 10, 2021. It is now read-only.
Hello. I wonder if I can use an arbitrary language model (e.g., an n-gram LM) to help decoding via the shallow fusion strategy. It seems that OpenNMT currently only supports shallow fusion with an RNNLM trained by OpenNMT itself.
Thank you in advance.
Yes, as of today we only support the ONMT RNNLM, since its code is integrated. Through the hooks mechanism it would not be hard to extend to other LMs, though. Do you have one in particular in mind?
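For reference, shallow fusion simply adds a weighted external-LM log-probability to the translation model's log-probability at each decoding step. The sketch below is illustrative only, not OpenNMT's actual hooks API; the function name, `lam` weight, and toy distributions are all assumptions for the example.

```python
import math

def shallow_fusion_scores(tm_log_probs, lm_log_probs, lam=0.3):
    """Combine translation-model and external-LM log-probabilities
    for one decoding step: score(y) = log p_TM(y) + lam * log p_LM(y).
    Tokens unknown to the LM get -inf from the LM side."""
    return {tok: tm_log_probs[tok] + lam * lm_log_probs.get(tok, float("-inf"))
            for tok in tm_log_probs}

# Toy next-token distributions over a tiny vocabulary.
tm = {"cat": math.log(0.6), "dog": math.log(0.4)}
lm = {"cat": math.log(0.2), "dog": math.log(0.8)}

fused = shallow_fusion_scores(tm, lm, lam=0.5)
best = max(fused, key=fused.get)  # the LM shifts the pick toward "dog"
```

Any LM that can return next-token log-probabilities (an n-gram model included) could plug into a hook at this scoring point.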