Is there a way to use quantized or distilled models such as https://huggingface.co/ALM/whisper-it-small or https://huggingface.co/Sandiago21/whisper-large-v2-italian (perhaps quantized to q8_0)?
Answered by Purfview, Dec 6, 2024
Those models are neither "quantized" nor "distilled". If you convert those fine-tuned models to CTranslate2 format, then you can use them.
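For reference, a minimal sketch of that conversion using CTranslate2's `ct2-transformers-converter` command line tool, with the model ID taken from the question. CTranslate2 names its 8-bit weight quantization `int8` (there is no `q8_0` option in its naming scheme); the output directory name here is just an example:

```shell
# Install CTranslate2 (which provides the converter) and transformers
pip install ctranslate2 "transformers[torch]"

# Convert the fine-tuned Hugging Face model to CTranslate2 format,
# quantizing the weights to 8-bit integers at conversion time
ct2-transformers-converter \
    --model Sandiago21/whisper-large-v2-italian \
    --output_dir whisper-large-v2-italian-ct2 \
    --quantization int8 \
    --copy_files tokenizer_config.json preprocessor_config.json
```

The resulting directory can then be loaded like any other CTranslate2 Whisper model, e.g. with faster-whisper's `WhisperModel("whisper-large-v2-italian-ct2")`.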