Instructions to use feizhai123/flux2-dev-modelopt-fp8 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Diffusers
How to use feizhai123/flux2-dev-modelopt-fp8 with Diffusers:
```bash
pip install -U diffusers transformers accelerate
```
```python
import torch
from diffusers import DiffusionPipeline
from diffusers.utils import load_image

# switch to "mps" for Apple devices
pipe = DiffusionPipeline.from_pretrained(
    "feizhai123/flux2-dev-modelopt-fp8",
    dtype=torch.bfloat16,
    device_map="cuda",
)

prompt = "Turn this cat into a dog"
input_image = load_image("https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/cat.png")
image = pipe(image=input_image, prompt=prompt).images[0]
```

- Notebooks
- Google Colab
- Kaggle
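The pipeline call above returns a standard `PIL.Image.Image`, so the result can be saved or inspected with the usual Pillow methods. A minimal sketch, using a blank placeholder image in place of the actual pipeline output (running the pipeline itself requires the model weights and a GPU):

```python
from PIL import Image

# Placeholder standing in for `pipe(...).images[0]`,
# which is a PIL.Image.Image instance
image = Image.new("RGB", (1024, 1024), color="white")

# Save the result to disk and check its dimensions
image.save("output.png")
print(image.size)  # (1024, 1024)
```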