Issues with xtts_v2 batched inference #3713
Unanswered
Rakshith12-pixel asked this question in General Q&A

I am working with xtts_v2 and have implemented batched inference for it. However, when the texts (sentences) in a batch have different lengths, I run into problems. Even xtts_demo.py performs a single-example feed-forward: a longer input is split into sentences, and each sentence is fed through the model on its own. It would be very helpful if anyone could explain why these issues happen and help me implement proper batched inference for the xtts_v2 model.
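For reference, the baseline that xtts_demo.py effectively takes is a per-sentence loop: split the input, then run one batch-size-1 forward pass per sentence. A minimal sketch using the standard Coqui TTS Python API is below; the model id is the published xtts_v2 identifier, while `ref.wav` and the sentences are placeholders.

```python
from TTS.api import TTS

# Published model id for XTTS v2; ref.wav and the sentences are placeholders.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2").to("cuda")

sentences = [
    "First sentence of a long input.",
    "Second, somewhat longer sentence of the same input.",
]
wav = []
for s in sentences:
    # One single-example feed-forward per sentence (batch size 1).
    wav.extend(tts.tts(text=s, speaker_wav="ref.wav", language="en"))
```

A likely source of the mixed-length trouble is padding: once tokenized sentences are stacked into one tensor, the shorter items must be padded, and unless each step is told via a mask which positions are real, the pad positions leak into the computation; in addition, XTTS's autoregressive GPT decoder finishes different batch items at different steps, so per-item stop tracking is needed. The following is a minimal, generic sketch of the padding-plus-mask step in plain PyTorch, not the actual XTTS internal API; `pad_batch` and the token ids are illustrative assumptions.

```python
import torch

def pad_batch(token_seqs, pad_id=0):
    """Right-pad a list of 1-D LongTensors to a common length; return the
    padded batch and a boolean mask (True = real token, False = padding)."""
    max_len = max(seq.size(0) for seq in token_seqs)
    batch = torch.full((len(token_seqs), max_len), pad_id, dtype=torch.long)
    mask = torch.zeros(len(token_seqs), max_len, dtype=torch.bool)
    for i, seq in enumerate(token_seqs):
        batch[i, : seq.size(0)] = seq
        mask[i, : seq.size(0)] = True
    return batch, mask

# Three "sentences" of different lengths, as hypothetical token ids.
seqs = [torch.tensor([5, 8, 2]),
        torch.tensor([7, 1]),
        torch.tensor([3, 9, 4, 6, 2])]
batch, mask = pad_batch(seqs)
print(batch.shape, mask.sum(dim=1))  # torch.Size([3, 5]) tensor([3, 2, 5])
```

If a batched forward pass ignores such a mask, or the decoder keeps generating for items that have already emitted their stop token, equal-length inputs work while mixed-length batches misbehave, which would match the symptoms described above; the per-sentence loop sidesteps all of this at the cost of throughput.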
Replies: 2 comments
- @Rakshith12-pixel Hello! Could you write to me? I think I can help you. My email: [email protected]
- @2Bye @Rakshith12-pixel Did you guys find anything on this? Thanks!