Can the inference be sped up on Apple Silicon? #3810
Unanswered
YofarDev asked this question in General Q&A
Hello,

I'm on macOS (M2 Pro, 16 GB), and generation takes about 20 seconds for a simple sentence. Is that normal? Is there any way to optimize it, or do you know of any other models or software that would reduce the inference time?

I'm using this model to clone a voice from a sample:

tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2").to(device)

Replies: 1 comment

- Any updates?
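One common lever on Apple Silicon is running the model on PyTorch's `mps` backend instead of the CPU. A minimal sketch of the device-selection logic is below; the `pick_device` helper is a hypothetical name for illustration, and note that xtts_v2's MPS support may be incomplete, so falling back to `cpu` if `mps` errors out is a reasonable safeguard:

```python
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    """Return the fastest available torch device string.

    Hypothetical helper: takes availability flags so the logic is
    testable without PyTorch installed. In a real script you would
    pass torch.cuda.is_available() and torch.backends.mps.is_available().
    """
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"
    return "cpu"


if __name__ == "__main__":
    # On an M2 Pro with a recent PyTorch, MPS is typically available:
    print(pick_device(cuda_available=False, mps_available=True))  # mps
```

In practice this would be wired up as `device = pick_device(torch.cuda.is_available(), torch.backends.mps.is_available())` before the existing `TTS(...).to(device)` call; whether `mps` actually beats `cpu` for xtts_v2 depends on the PyTorch version, so it is worth timing both.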