---
title: Replicate Inference Provider Examples
emoji: 🚀
colorFrom: blue
colorTo: green
sdk: static
pinned: true
---
# Replicate Inference Provider Examples
Use Replicate through Hugging Face's standard `InferenceClient` by setting `provider="replicate"`. The examples below authenticate with your `HF_TOKEN` and use models available through Hugging Face Inference Providers.
## Image generation

```python
import os

from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="replicate",
    api_key=os.environ["HF_TOKEN"],
)

# text_to_image returns a PIL.Image
image = client.text_to_image(
    "A cinematic photo of an astronaut riding a horse",
    model="Tongyi-MAI/Z-Image-Turbo",
)
image.save("replicate-astronaut.png")
```
## Image editing

```python
import os

from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="replicate",
    api_key=os.environ["HF_TOKEN"],
)

# Read the source image as raw bytes
with open("cat.png", "rb") as image_file:
    input_image = image_file.read()

# image_to_image returns the edited image as a PIL.Image
image = client.image_to_image(
    input_image,
    prompt="Turn the cat into a tiger.",
    model="black-forest-labs/FLUX.2-dev",
)
image.save("replicate-tiger.png")
```
## Video generation

```python
import os

from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="replicate",
    api_key=os.environ["HF_TOKEN"],
)

# text_to_video returns the generated video as raw bytes
video = client.text_to_video(
    "A young man walking on the street",
    model="Wan-AI/Wan2.2-T2V-A14B",
)

# Write the bytes to disk so the result isn't discarded
with open("replicate-video.mp4", "wb") as f:
    f.write(video)
```
## Speech recognition

```python
import os

from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="replicate",
    api_key=os.environ["HF_TOKEN"],
)

# Accepts a local file path, URL, or raw audio bytes
output = client.automatic_speech_recognition(
    "sample1.flac",
    model="openai/whisper-large-v3",
)
print(output.text)
```