Hi, the docs say Argos Translate uses SentencePiece (and maybe Sacremoses?) for tokenization and Stanza for sentence boundary detection. I'm wondering whether it is possible to translate pre-split, pre-tokenized sentences (a list of lists of tokens). In that case I could drop many of Argos Translate's dependencies, since the strict version pinning of dependencies causes many problems (cf. #362, #395).
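To make the request concrete, here is a minimal sketch of the input format I have in mind. The list-of-lists structure is the hypothetical pre-tokenized input; the `argostranslate.translate.translate(text, from_code, to_code)` call mentioned in the comment is the existing string-based API, and the detokenize-and-join step is just an illustration of the current workaround, not something Argos Translate does internally:

```python
# Hypothetical pre-tokenized input: sentence splitting and tokenization
# have already been done upstream, so Stanza and SentencePiece would not
# be needed on the Argos Translate side.
pretokenized = [
    ["Hello", ",", "world", "!"],
    ["How", "are", "you", "?"],
]

# Current workaround: detokenize back to plain strings and call the
# existing string-based API one sentence at a time, e.g.
#   argostranslate.translate.translate(sentence, "en", "fr")
sentences = [" ".join(tokens) for tokens in pretokenized]
print(sentences)  # ['Hello , world !', 'How are you ?']
```

The point is that the round trip through plain strings re-runs sentence splitting and tokenization inside Argos Translate, which is exactly the work (and the dependencies) I would like to be able to skip.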