The full Wan2.1 14B VACE model, converted from fp16 to fp8_scaled using this script.
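The core idea behind a scaled fp8 conversion can be sketched as follows. This is a conceptual illustration only, not the conversion script used for this checkpoint: it computes a per-tensor scale so that the largest weight magnitude maps to fp8 e4m3's maximum representable value (448), then divides the weights by that scale before casting. The function names and plain-Python representation are illustrative assumptions.

```python
FP8_E4M3_MAX = 448.0  # largest finite value representable in fp8 e4m3

def quantize_scaled(weights, fp8_max=FP8_E4M3_MAX):
    """Per-tensor scaled quantization sketch: rescale so the largest
    magnitude hits fp8's max, then the result would be cast to fp8.
    Here values stay as Python floats purely for illustration."""
    amax = max(abs(w) for w in weights)
    scale = amax / fp8_max if amax > 0 else 1.0
    q = [w / scale for w in weights]  # these would be stored in fp8
    return q, scale  # the scale is stored alongside the tensor

def dequantize(q, scale):
    """Recover approximate original values by multiplying the scale back in."""
    return [x * scale for x in q]
```

Storing the scale per tensor is what distinguishes fp8_scaled from a naive fp8 cast: without it, weights far below fp8's dynamic range would lose most of their precision.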
Model tree for spacepxl/Wan2.1_VACE_14B_fp8_scaled

Base model: Wan-AI/Wan2.1-VACE-14B