Replies: 3 comments 2 replies
- Have you tried wan2gp https://github.com/deepbeepmeep/Wan2GP? It is already optimized for low-VRAM/RAM machines. I do want to add GGUF support at some point, and FP8 should already work with some models.
- Pre-quantized FP8 should load fine now in the WanX-i2v tab. FP8 scaled gives better quality, though.
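
If you are unsure whether a downloaded checkpoint is already pre-quantized, the safetensors header records each tensor's dtype, so you can check without loading the weights. A minimal sketch (the file name is a placeholder, not a real Wan2GP asset; dtype strings such as F8_E4M3 are those defined by the safetensors format):

```python
import json
import struct

def safetensors_dtypes(path):
    """Read only the JSON header of a .safetensors file and return
    a mapping of tensor name -> dtype string (e.g. 'F16', 'BF16', 'F8_E4M3')."""
    with open(path, "rb") as f:
        # The format starts with an 8-byte little-endian header length,
        # followed by a JSON header describing every tensor.
        header_len = struct.unpack("<Q", f.read(8))[0]
        header = json.loads(f.read(header_len))
    return {name: info["dtype"]
            for name, info in header.items() if name != "__metadata__"}

if __name__ == "__main__":
    dtypes = safetensors_dtypes("wan_i2v_checkpoint.safetensors")  # placeholder path
    print(set(dtypes.values()))  # a pre-quantized fp8 checkpoint reports 'F8_E4M3' here
```
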
- To use an fp8 e4m3fn model, select an e4m3fn DiT model and enable "Use fp8". To use fp8 scaled, select a full fp16 model and enable "Use fp8 scaled"; the weights are scaled at generation time (a rough sketch of that on-the-fly scaling follows below).
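
This is not Wan2GP's actual implementation, just a minimal sketch of what per-tensor "fp8 scaled" quantization conceptually does; the helper names are made up, and it assumes PyTorch 2.1+ for torch.float8_e4m3fn:

```python
import torch

F8_E4M3_MAX = 448.0  # largest finite value representable in float8_e4m3fn

def quantize_fp8_scaled(weight: torch.Tensor):
    """Scale an fp16/fp32 weight so its largest magnitude fits the e4m3fn range,
    then cast to fp8. Returns the fp8 tensor and the scale needed to undo it."""
    scale = weight.abs().max().float().clamp(min=1e-12) / F8_E4M3_MAX
    return (weight / scale).to(torch.float8_e4m3fn), scale

def dequantize_fp8_scaled(w_fp8: torch.Tensor, scale: torch.Tensor, dtype=torch.float16):
    # Cast back up and re-apply the scale before using the weight in a matmul.
    return (w_fp8.to(torch.float32) * scale).to(dtype)

if __name__ == "__main__":
    w = torch.randn(1024, 1024, dtype=torch.float16)
    w_fp8, scale = quantize_fp8_scaled(w)
    err = (dequantize_fp8_scaled(w_fp8, scale) - w).abs().mean().item()
    print(f"mean abs round-trip error: {err:.6f}")
```

The per-tensor scale is why "fp8 scaled" tends to preserve more precision than a plain cast of an already-quantized e4m3fn file: the full dynamic range of the format is used for each tensor instead of whatever range the original fp16 values happened to occupy.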
- First, I would like to thank the creator for a great tool. As a casual user, I need support for pre-quantized FP8 and GGUF models. I have a weak machine, and i2v crashes even with swapping and the built-in FP8 functions turned on. This support would also save disk space for users who don't want to download a 28-32 GB checkpoint.