---
library_name: diffusers
tags:
- fp8
- safetensors
- converted-by-gradio
---
# FP8 Model Conversion

- **Source**: `https://huggingface.co/Kijai/WanVideo_comfy/OneToAllAnimation`
- **Original File(s)**: `Wan21-OneToAllAnimation_1_3B_v2_fp16.safetensors`
- **Original Format**: `safetensors`
- **FP8 Format**: `E5M2`
- **FP8 File**: `Wan21-OneToAllAnimation_1_3B_v2_fp16-fp8-e5m2.safetensors`
## Usage
```python
from safetensors.torch import load_file
import torch

# Load the FP8 checkpoint; tensors keep their torch.float8_e5m2 dtype
fp8_state = load_file("Wan21-OneToAllAnimation_1_3B_v2_fp16-fp8-e5m2.safetensors")

# Upcast to a compute dtype before loading into your instantiated model,
# since most operations do not run natively in FP8
state = {k: v.to(torch.float16) if v.is_floating_point() else v
         for k, v in fp8_state.items()}
model.load_state_dict(state)
```
> **Note**: FP8 tensors load with their `float8_e5m2` dtype preserved; upcast them (e.g. to `float16`) before computation on hardware without native FP8 support.
> Requires PyTorch ≥ 2.1, which introduced the `float8` dtypes.
## Statistics

- **Total tensors**: 1329
- **Converted to FP8**: 1329
- **Skipped (non-float)**: 0