Issues with xtts_v2 batched inference #3713
Unanswered · Rakshith12-pixel asked this question in General Q&A · 0 replies
I am working with xtts_v2.
I have implemented batched inference on xtts_v2. However, if each text (sentence) in the batch is of a different length, then I face the following issues:
Even xtts_demo.py performs a single-example forward pass for each chunk, even when a longer sentence is split before being fed in.
It would be very helpful if anyone could explain why these issues happen and help me implement proper batched inference for the xtts_v2 model.
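For context, batching variable-length sentences usually requires right-padding each tokenized sequence to the batch maximum and carrying an attention mask so padded positions are ignored. The sketch below is a minimal, generic illustration of that idea with PyTorch's `pad_sequence`; the token values are made-up stand-ins, not actual XTTS tokenizer output, and this is not the model's internal batching code.

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# Hypothetical token-id sequences of different lengths, standing in for
# three tokenized sentences in one batch (not real XTTS tokenizer output).
seqs = [
    torch.tensor([5, 9, 2]),
    torch.tensor([7, 1]),
    torch.tensor([3, 3, 3, 3]),
]

# Right-pad every sequence to the batch maximum with a pad id of 0.
padded = pad_sequence(seqs, batch_first=True, padding_value=0)

# Attention mask: 1 for real tokens, 0 for padding, so a model's forward
# pass can ignore the padded positions.
lengths = torch.tensor([len(s) for s in seqs])
mask = (torch.arange(padded.size(1)).unsqueeze(0) < lengths.unsqueeze(1)).long()

print(padded.shape)       # torch.Size([3, 4])
print(mask.tolist())      # [[1, 1, 1, 0], [1, 1, 0, 0], [1, 1, 1, 1]]
```

If the model's forward pass does not consume such a mask, padded positions leak into the computation, which is one common reason batched outputs for shorter sentences differ from single-example inference.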